WorldWideScience

Sample records for values springer series

  1. News from the Library: Springer e-books in Engineering and Mathematics - now at your desktop!

    CERN Multimedia

    CERN Library

    2013-01-01

    Users obviously expect library collections to be shaped to meet their information needs. Today, online systems have replaced the good old "Book Suggestion Register" to help readers provide libraries with input.   Moreover, e-books lend themselves by their nature to a trial period, whereby the needs and the potential usage of collections can be effectively measured. In this sense, the trial of Springer e-books on engineering done in October-November last year was extremely fruitful because it showed the interest of the community and provided us with essential feedback. Following that trial, the Library now provides access to a collection of Springer e-books on Engineering published in 2012 and 2013 and also on Mathematics (2010-2013). These collections complement the Springer Physics and Astronomy and "Lecture Notes in Physics" e-books series. As usual, your feedback is most welcome! Please contact us by e-mail.

  2. Special values of the hypergeometric series

    CERN Document Server

    Ebisu, Akihito

    2017-01-01

    In this paper, the author presents a new method for finding identities for hypergeometric series, such as the (Gauss) hypergeometric series, the generalized hypergeometric series and the Appell-Lauricella hypergeometric series. Furthermore, using this method, the author obtains identities for the hypergeometric series F(a,b;c;x) and shows that values of F(a,b;c;x) at some points x can be expressed in terms of gamma functions, together with certain elementary functions. The author tabulates the values of F(a,b;c;x) that can be obtained with this method and finds that this set includes almost all previously known values and many previously unknown values.
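
    As background for the kind of identity tabulated in such work, a classical special value (quoted here for illustration, not taken from the paper) is Gauss's summation theorem, which evaluates the series at x = 1 in terms of gamma functions:

        F(a,b;c;1) = \frac{\Gamma(c)\,\Gamma(c-a-b)}{\Gamma(c-a)\,\Gamma(c-b)}, \qquad \operatorname{Re}(c-a-b) > 0.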

  3. Springer Publishing book booth | 8-9 October

    CERN Multimedia

    CERN Library

    2015-01-01

    Continuing the spirit of the CERN Book Fairs of the past years, Springer Publishing will have a book booth in the foyer of the Main Building, from 8 to 9 October. Some of the latest titles in particle physics and related fields will be on sale.   For the occasion, Professor Ugo Amaldi will present his new book “Particle Accelerators: From Big Bang Physics to Hadron Therapy” on Thursday, 8 October at 5 p.m. in Room F (Charpak room). The presentation will take place in the framework of the Italian Teachers week and will be followed by a signing session. A special highlight at the Springer booth will be the presentation of the CERN-sponsored Open Access book: “J Rafelski (ed): Melting Hadrons, Boiling Quarks - From Hagedorn Temperature to Ultra-Relativistic Heavy-Ion Collisions at CERN; With a Tribute to Rolf Hagedorn”.

  4. Filtrations on Springer fiber cohomology and Kostka polynomials

    Science.gov (United States)

    Bellamy, Gwyn; Schedler, Travis

    2018-03-01

    We prove a conjecture which expresses the bigraded Poisson-de Rham homology of the nilpotent cone of a semisimple Lie algebra in terms of the generalized (one-variable) Kostka polynomials, via a formula suggested by Lusztig. This allows us to construct a canonical family of filtrations on the flag variety cohomology, and hence on irreducible representations of the Weyl group, whose Hilbert series are given by the generalized Kostka polynomials. We deduce consequences for the cohomology of all Springer fibers. In particular, this computes the grading on the zeroth Poisson homology of all classical finite W-algebras, as well as the filtration on the zeroth Hochschild homology of all quantum finite W-algebras, and we generalize to all homology degrees. As a consequence, we deduce a conjecture of Proudfoot on symplectic duality, relating in type A the Poisson homology of Slodowy slices to the intersection cohomology of nilpotent orbit closures. In the last section, we give an analogue of our main theorem in the setting of mirabolic D-modules.

  5. Integer-valued time series

    NARCIS (Netherlands)

    van den Akker, R.

    2007-01-01

    This thesis addresses statistical problems in econometrics. The first part contributes statistical methodology for nonnegative integer-valued time series. The second part of this thesis discusses semiparametric estimation in copula models and develops semiparametric lower bounds for a large class of

  6. Springer Handbook of Condensed Matter and Materials Data

    CERN Document Server

    Martienssen, Werner

    2005-01-01

    Condensed Matter and Materials Science are two of the most active fields of applied physics, with a stream of discoveries in areas from superconductivity and magnetism to the optical, electronic and mechanical properties of materials. While a huge amount of data has been compiled and spread over numerous reference works, no single volume compiles the most used information. Springer Handbook of Condensed Matter and Materials Data provides a concise compilation of data and functional relationships from the fields of solid-state physics and materials in this 1200-page volume. The data, encapsulated in over 750 tables and 1025 illustrations, have been selected and extracted primarily from the extensive high-quality data collection Landolt-Börnstein and also from other systematic data sources and recent publications of physical and technical property data. Many chapters are authored by Landolt-Börnstein editors, including the editors of this Springer Handbook. Key Topics Fundamental Constants The International S...

  7. Springer Handbook of Acoustics

    CERN Document Server

    Rossing, Thomas D

    2007-01-01

    Acoustics, the science of sound, has developed into a broad interdisciplinary field encompassing the academic disciplines of physics, engineering, psychology, speech, audiology, music, architecture, physiology, neuroscience, and others. The Springer Handbook of Acoustics is an unparalleled modern handbook reflecting this richly interdisciplinary nature edited by one of the acknowledged masters in the field, Thomas Rossing. Researchers and students benefit from the comprehensive contents spanning: animal acoustics including infrasound and ultrasound, environmental noise control, music and human speech and singing, physiological and psychological acoustics, architectural acoustics, physical and engineering acoustics, signal processing, medical acoustics, and ocean acoustics. This handbook reviews the most important areas of acoustics, with emphasis on current research. The authors of the various chapters are all experts in their fields. Each chapter is richly illustrated with figures and tables. The latest rese...

  8. Time Series Forecasting with Missing Values

    Directory of Open Access Journals (Sweden)

    Shin-Fu Wu

    2015-11-01

    Full Text Available Time series prediction has become more popular in various kinds of applications such as weather prediction, control engineering, financial analysis, industrial monitoring, etc. To deal with real-world problems, we are often faced with missing values in the data due to sensor malfunctions or human errors. Traditionally, the missing values are simply omitted or replaced by means of imputation methods. However, omitting those missing values may cause temporal discontinuity. Imputation methods, on the other hand, may alter the original time series. In this study, we propose a novel forecasting method based on least squares support vector machine (LSSVM). We employ input patterns with temporal information, defined as the local time index (LTI). Time series data as well as local time indexes are fed to the LSSVM for forecasting without imputation. We compare the forecasting performance of our method with other imputation methods. Experimental results show that the proposed method is promising and is worth further investigation.
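
    A rough sketch of the idea of feeding temporal position information alongside lagged observations, so that gaps need no imputation. This is an illustration only: scikit-learn's SVR stands in for LSSVM, and the pattern/LTI construction below (build_patterns, the lag count, the gap-to-previous-sample feature) is an assumed simplification rather than the authors' exact formulation.

        import numpy as np
        from sklearn.svm import SVR

        def build_patterns(t, y, lags=3):
            """Build lagged input patterns augmented with a local time index (LTI).

            t, y: time stamps and values of the observed samples only
            (missing samples are simply absent; nothing is imputed).
            """
            X, targets = [], []
            for i in range(lags, len(y)):
                lagged = y[i - lags:i]
                lti = t[i] - t[i - 1]              # temporal information: gap to the previous sample
                X.append(np.append(lagged, lti))
                targets.append(y[i])
            return np.array(X), np.array(targets)

        # toy series with a sensor outage (time stamps 7 and 8 are missing)
        t = np.array([0, 1, 2, 3, 4, 5, 6, 9, 10, 11, 12, 13, 14], dtype=float)
        y = np.sin(0.5 * t)

        X, targets = build_patterns(t, y)
        model = SVR(kernel="rbf", C=10.0).fit(X, targets)
        print(model.predict(X[-1:]))               # one-step-ahead forecast from the last pattern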

  9. Springer handbook of acoustics

    CERN Document Server

    2014-01-01

    Acoustics, the science of sound, has developed into a broad interdisciplinary field encompassing the academic disciplines of physics, engineering, psychology, speech, audiology, music, architecture, physiology, neuroscience, and electronics. The Springer Handbook of Acoustics is, also in its 2nd edition, an unparalleled modern handbook reflecting this richly interdisciplinary nature, edited by one of the acknowledged masters in the field, Thomas Rossing. Researchers and students benefit from the comprehensive contents. This new edition of the Handbook features over 11 revised and expanded chapters, new illustrations, and 2 new chapters covering microphone arrays and acoustic emission. Updated chapters contain the latest research and applications in, e.g., sound propagation in the atmosphere, nonlinear acoustics in fluids, building and concert hall acoustics, signal processing, psychoacoustics, computer music, animal bioacoustics, sound intensity, modal acoustics as well as new chapters on microphone arrays an...

  10. Springer handbook of spacetime

    CERN Document Server

    Petkov, Vesselin

    2014-01-01

    The Springer Handbook of Spacetime is dedicated to the ground-breaking paradigm shifts embodied in the two relativity theories, and describes in detail the profound reshaping of physical sciences they ushered in. It includes in a single volume chapters on foundations, on the underlying mathematics, on physical and astrophysical implications, experimental evidence and cosmological predictions, as well as chapters on efforts to unify general relativity and quantum physics. The Handbook can be used as a desk reference by researchers in a wide variety of fields, not only by specialists in relativity but also by researchers in related areas that either grew out of, or are deeply influenced by, the two relativity theories: cosmology, astronomy and astrophysics, high energy physics, quantum field theory, mathematics, and philosophy of science. It should also serve as a valuable resource for graduate students and young researchers entering these areas, and for instructors who teach courses on these subjects. The Han...

  11. Springer handbook of robotics

    CERN Document Server

    Khatib, Oussama

    2016-01-01

    The second edition of this handbook provides a state-of-the-art overview of the various aspects of the rapidly developing field of robotics. Reaching for the human frontier, robotics is vigorously engaged in the growing challenges of new emerging domains. Interacting, exploring, and working with humans, the new generation of robots will increasingly touch people and their lives. The credible prospect of practical robots among humans is the result of the scientific endeavour of half a century of robotic developments that established robotics as a modern scientific discipline. The ongoing vibrant expansion and strong growth of the field during the last decade have fueled this second edition of the Springer Handbook of Robotics. The first edition of the handbook soon became a landmark in robotics publishing and won the American Association of Publishers PROSE Award for Excellence in Physical Sciences & Mathematics as well as the organization’s Award for Engineering & Technology. The second edition o...

  12. Time Series Forecasting with Missing Values

    OpenAIRE

    Shin-Fu Wu; Chia-Yung Chang; Shie-Jue Lee

    2015-01-01

    Time series prediction has become more popular in various kinds of applications such as weather prediction, control engineering, financial analysis, industrial monitoring, etc. To deal with real-world problems, we are often faced with missing values in the data due to sensor malfunctions or human errors. Traditionally, the missing values are simply omitted or replaced by means of imputation methods. However, omitting those missing values may cause temporal discontinuity. Imputation methods, o...

  13. Tijdschriften na 1 januari 2005. Elsevier Science, Wiley, Springer/Kluwer [Journals after 1 January 2005: Elsevier Science, Wiley, Springer/Kluwer]

    NARCIS (Netherlands)

    Klugkist, A.C.; Laarhoven, Peter van

    2004-01-01

    In issue 3 of this volume of Pictogram we focused on the then upcoming negotiations with Elsevier Science, Wiley and Springer/Kluwer on the renewal of the so-called package licences. As most readers know, the faculties and the UB have since 2000/2001 been subscribed

  14. Springer handbook of bio-/neuroinformatics

    CERN Document Server

    2014-01-01

    The Springer Handbook of Bio-/Neuro-Informatics is the first published book in one volume that explains together the basics and the state-of-the-art of two major science disciplines in their interaction and mutual relationship, namely information sciences, bioinformatics and neuroinformatics. Bioinformatics is the area of science which is concerned with the information processes in biology and the development and applications of methods, tools and systems for storing and processing of biological information, thus facilitating new knowledge discovery. Neuroinformatics is the area of science which is concerned with the information processes in the brain and nervous system and the development and applications of methods, tools and systems for storing and processing of neural information, thus facilitating new knowledge discovery. The text contains 62 chapters organized in 12 parts, 6 of them covering topics from information science and bioinformatics, and 6 covering topics from information science and neuroinformatics. Each chapter ...

  15. Symposium in Honour of T.A. Springer

    CERN Document Server

    Hesselink, Wim; Kallen, Wilberd; Strooker, Jan

    1987-01-01

    From 1-4 April 1986 a Symposium on Algebraic Groups was held at the University of Utrecht, The Netherlands, in celebration of the 350th birthday of the University and the 60th of T.A. Springer. Recognized leaders in the field of algebraic groups and related areas gave lectures which covered wide and central areas of mathematics. Though the fourteen papers in this volume are mostly original research contributions, some survey articles are included. Centering on the Symposium subject, such diverse topics are covered as Discrete Subgroups of Lie Groups, Invariant Theory, D-modules, Lie Algebras, Special Functions, Group Actions on Varieties.

  16. Springer Publishing Booth | 4-5 October

    CERN Multimedia

    2016-01-01

    Continuing the spirit of the CERN Book Fairs of past years, Springer Nature will be present with a book and journal booth on October 4th and 5th, located as usual in the foyer of the Main Building. Some of the latest titles in particle physics and related fields will be on sale.   You are cordially invited to come to the booth to meet Heike Klingebiel (Licensing Manager / Library Sales), Hisako Niko (Publishing Editor) and Christian Caron (Publishing Editor). In particular, information about the new Nano database – nanomaterial and device profiles from high-impact journals and patents, manually abstracted, curated and updated by nanotechnology experts – will be available. The database is accessible here: http://nano.nature.com/.

  17. Otolith growth of Springer's demoiselle, Chrysiptera springeri (Pomacentridae, Allen & Lubbock), on a protected and non-protected coral reef

    DEFF Research Database (Denmark)

    Retzel, A.; Hansen, A.D.; Grønkjær, P.

    2007-01-01

    The structural complexity of coral reefs is important for their function as shelter and feeding habitats for coral reef fishes, but physical disturbance by human activities often reduces the complexity of the reefs by selectively destroying fragile and more complex coral species. The damselfish Springer's demoiselle Chrysiptera springeri primarily utilizes complex coral heads for shelter and is hence vulnerable to human disturbance. In order to evaluate the potential effect of habitat degradation on juvenile fish growth, coral reef cover, fish age at settling and otolith growth of juvenile Springer's demoiselle were investigated on a protected and a non-protected coral reef in Darvel Bay, Borneo. The protected reef had higher coverage of complex branching corals and exhibited a more complex 3-dimensional structure than the non-protected reef. Springer's demoiselle settled at the same age on non-protected and protected reefs...

  18. Springer Handbook of Crystal Growth

    CERN Document Server

    Dhanaraj, Govindhan; Prasad, Vishwanath; Dudley, Michael

    2010-01-01

    Over the years, many successful attempts have been made to describe the art and science of crystal growth. Most modern advances in semiconductor and optical devices would not have been possible without the development of many elemental, binary, ternary, and other compound crystals of varying properties and large sizes. The objective of the Springer Handbook of Crystal Growth is to present state-of-the-art knowledge of both bulk and thin-film crystal growth. The goal is to make readers understand the basics of the commonly employed growth processes, materials produced, and defects generated. Almost 100 leading scientists, researchers, and engineers from 22 different countries from academia and industry have been selected to write chapters on the topics of their expertise. They have written 52 chapters on the fundamentals of bulk crystal growth from the melt, solution, and vapor, epitaxial growth, modeling of growth processes and defects, techniques of defect characterization as well as some contemporary specia...

  19. Exercise-induced hyperthermia syndrome (canine stress syndrome) in four related male English springer spaniels

    Directory of Open Access Journals (Sweden)

    Thrift E

    2017-09-01

    Full Text Available Elizabeth Thrift,1 Justin A Wimpole,2 Georgina Child,2 Narelle Brown,1 Barbara Gandolfi,3 Richard Malik4 1Animal Referral Hospital, 2Small Animal Specialist Hospital, Sydney, NSW, Australia; 3Veterinary Medicine and Surgery, College of Veterinary Medicine, University of Missouri, Columbia, MO, USA; 4Centre for Veterinary Education, University of Sydney, Sydney, NSW, Australia Objective: This retrospective study describes the signalment, clinical presentation, diagnostic findings, and mode of inheritance in four young male English springer spaniel dogs with presumptive canine stress syndrome. Materials and methods: Appropriate cases were located through searches of the medical records of two large private referral centers. Inclusion criteria comprised English springer spaniel dogs with tachypnea and hyperthermia that subsequently developed weakness or collapse, with or without signs of hemorrhage, soon after a period of mild-to-moderate exercise. The pedigrees of the four affected dogs, as well as eleven related English springer spaniels, were then analyzed to determine a presumptive mode of genetic inheritance. Results: Four dogs met the inclusion criteria. All four were male, suggesting the possibility of a recessive sex-linked heritable disorder. Pedigree analysis suggests that more dogs may be potentially affected, although these dogs may never have had the concurrent triggering drug/activity/event to precipitate the clinical syndrome. There was complete resolution of clinical signs in three of the four dogs with aggressive symptomatic and supportive therapy, with one dog dying during treatment. Conclusion: Dogs with canine stress syndrome have the potential for rapid recovery if treated aggressively and if the complications of the disease (eg, coagulopathy) are anticipated. All four dogs were male, suggesting the possibility of a recessive sex-linked mode of inheritance. Further genetic analyses should be strongly considered by those

  20. Springer handbook of lasers and optics

    CERN Document Server

    2012-01-01

    The Springer Handbook of Lasers and Optics provides fast, up-to-date, comprehensive and authoritative coverage of the wide fields of optics and lasers. It is written for daily use in the office or laboratory and offers explanatory text, data, and references needed for anyone working with lasers and optical instruments. This second edition features numerous updates and additions. Especially four new chapters on Fiber Optics, Integrated Optics, Frequency Combs, and Interferometry reflect the major changes. In addition, chapters Optical Materials and Their Properties, Optical Detectors, Nanooptics, and Optics far Beyond the Diffraction Limit have been thoroughly revised and updated. The now 25 chapters are grouped into four parts which cover basic principles and materials, fabrication and properties of optical components, coherent and incoherent light sources, and, finally, selected applications and special fields such as terahertz photonics, x-ray optics and holography. Each chapter is authored by respected exp...

  1. Springer handbook of model-based science

    CERN Document Server

    Bertolotti, Tommaso

    2017-01-01

    The handbook offers the first comprehensive reference guide to the interdisciplinary field of model-based reasoning. It highlights the role of models as mediators between theory and experimentation, and as educational devices, as well as their relevance in testing hypotheses and explanatory functions. The Springer Handbook merges philosophical, cognitive and epistemological perspectives on models with the more practical needs related to the application of this tool across various disciplines and practices. The result is a unique, reliable source of information that guides readers toward an understanding of different aspects of model-based science, such as the theoretical and cognitive nature of models, as well as their practical and logical aspects. The inferential role of models in hypothetical reasoning, abduction and creativity once they are constructed, adopted, and manipulated for different scientific and technological purposes is also discussed. Written by a group of internationally renowned experts in ...

  2. Recurrent Neural Networks for Multivariate Time Series with Missing Values.

    Science.gov (United States)

    Che, Zhengping; Purushotham, Sanjay; Cho, Kyunghyun; Sontag, David; Liu, Yan

    2018-04-17

    Multivariate time series data in practical applications, such as health care, geoscience, and biology, are characterized by a variety of missing values. In time series prediction and other related tasks, it has been noted that missing values and their missing patterns are often correlated with the target labels, a.k.a., informative missingness. There is very limited work on exploiting the missing patterns for effective imputation and improving prediction performance. In this paper, we develop novel deep learning models, namely GRU-D, as one of the early attempts. GRU-D is based on Gated Recurrent Unit (GRU), a state-of-the-art recurrent neural network. It takes two representations of missing patterns, i.e., masking and time interval, and effectively incorporates them into a deep model architecture so that it not only captures the long-term temporal dependencies in time series, but also utilizes the missing patterns to achieve better prediction results. Experiments of time series classification tasks on real-world clinical datasets (MIMIC-III, PhysioNet) and synthetic datasets demonstrate that our models achieve state-of-the-art performance and provide useful insights for better understanding and utilization of missing values in time series analysis.
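
    The two auxiliary inputs described above can be sketched as follows: a masking matrix marking which entries are observed, and a time-interval matrix recording how long each variable has gone unobserved. This is a minimal illustration of the input representations only, not the GRU-D cell itself; the function and variable names are mine.

        import numpy as np

        def masking_and_delta(x, timestamps):
            """Masking m and time-interval delta for a T x D series x with NaNs for missing entries."""
            T, D = x.shape
            m = (~np.isnan(x)).astype(float)       # 1 where observed, 0 where missing
            delta = np.zeros((T, D))
            for t in range(1, T):
                gap = timestamps[t] - timestamps[t - 1]
                # time since the last observation of each variable:
                # reset to the gap if the variable was just observed, otherwise accumulate
                delta[t] = np.where(m[t - 1] == 1, gap, gap + delta[t - 1])
            return m, delta

        x = np.array([[1.0, np.nan],
                      [np.nan, 2.0],
                      [3.0, np.nan],
                      [4.0, 5.0]])
        m, delta = masking_and_delta(x, timestamps=np.array([0.0, 1.0, 2.0, 3.0]))
        print(m)
        print(delta)   # grows while a variable stays unobserved, resets after an observation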

  3. Springer handbook of mechanical engineering

    Energy Technology Data Exchange (ETDEWEB)

    Grote, Karl-Heinrich [Magdeburg Univ. (Germany). Dept. of Mechanical Engineering; Antonsson, Erik K. (eds.) [California Inst. of Technology (CALTEC), Pasadena, CA (United States). Dept. of Mechanical Engineering

    2009-07-01

    Mechanical Engineering is a professional engineering discipline which involves the application of principles of physics, design, manufacturing and maintenance of mechanical systems. It requires a solid understanding of the key concepts including mechanics, kinematics, thermodynamics and energy. Mechanical engineers use these principles and others in the design and analysis of automobiles, aircraft, heating and cooling systems, industrial equipment and machinery. In addition to these main areas, specialized fields are necessary to prepare future engineers for their positions in industry, such as mechatronics and robotics, transportation and logistics, fuel technology, automotive engineering, biomechanics, vibration, optics and others. Accordingly, the Springer Handbook of Mechanical Engineering devotes its contents to all areas of interest for the practicing engineer as well as for the student at various levels and educational institutions. Authors from all over the world have contributed with their expertise and support the globally working engineer in finding a solution for today's mechanical engineering problems. Each subject is discussed in detail and supported by numerous figures and tables. DIN standards are retained throughout and ISO equivalents are given where possible. The text offers a concise but detailed and authoritative treatment of the topics with full references. (orig.)

  4. Hierarchical Hidden Markov Models for Multivariate Integer-Valued Time-Series

    DEFF Research Database (Denmark)

    Catania, Leopoldo; Di Mari, Roberto

    2018-01-01

    We propose a new flexible dynamic model for multivariate nonnegative integer-valued time-series. Observations are assumed to depend on the realization of two additional unobserved integer-valued stochastic variables which control for the time- and cross-dependence of the data. An Expectation-Maximization algorithm for maximum likelihood estimation of the model's parameters is derived. We provide conditional and unconditional (cross)-moments implied by the model, as well as the limiting distribution of the series. A Monte Carlo experiment investigates the finite sample properties of our estimation...

  5. Critical values for unit root tests in seasonal time series

    NARCIS (Netherlands)

    Ph.H.B.F. Franses (Philip Hans); B. Hobijn (Bart)

    1997-01-01

    In this paper, we present tables with critical values for a variety of tests for seasonal and non-seasonal unit roots in seasonal time series. We consider (extensions of) the Hylleberg et al. and Osborn et al. test procedures. These extensions concern time series with increasing seasonal

  6. A time series model: First-order integer-valued autoregressive (INAR(1))

    Science.gov (United States)

    Simarmata, D. M.; Novkaniza, F.; Widyaningsih, Y.

    2017-07-01

    Nonnegative integer-valued time series arise in many applications. A time series model, the first-order Integer-valued AutoRegressive model (INAR(1)), is constructed with the binomial thinning operator to model nonnegative integer-valued time series. INAR(1) depends on one previous period of the process. The parameters of the model can be estimated by Conditional Least Squares (CLS). The specification of INAR(1) follows that of AR(1). Forecasting in INAR(1) uses the median or Bayesian forecasting methodology. The median forecasting methodology finds the smallest integer s such that the cumulative distribution function (CDF) up to s is greater than or equal to 0.5. The Bayesian forecasting methodology produces h-step-ahead forecasts by generating the model parameter and the innovation-term parameter using Adaptive Rejection Metropolis Sampling within Gibbs sampling (ARMS), and then finding the least integer s where the CDF up to s is greater than or equal to u, where u is a value drawn from the Uniform(0,1) distribution. INAR(1) is applied to monthly pneumonia cases in Penjaringan, Jakarta Utara, from January 2008 to April 2016.
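
    A small sketch of the mechanics described above, assuming Poisson innovations (the innovation distribution, parameter values and helper names here are assumptions for illustration): binomial thinning generates the series, and the median forecast is the smallest integer s whose one-step-ahead CDF reaches 0.5.

        import numpy as np
        from scipy.stats import binom, poisson

        rng = np.random.default_rng(0)

        def simulate_inar1(alpha, lam, n):
            """X_t = alpha o X_{t-1} + e_t, with binomial thinning 'o' and e_t ~ Poisson(lam)."""
            x = np.zeros(n, dtype=int)
            x[0] = rng.poisson(lam)
            for t in range(1, n):
                x[t] = rng.binomial(x[t - 1], alpha) + rng.poisson(lam)
            return x

        def median_forecast(alpha, lam, x_last, max_s=200):
            """Smallest integer s with one-step-ahead CDF >= 0.5.

            Given X_t = x_last, X_{t+1} is Binomial(x_last, alpha) plus an independent
            Poisson(lam), so its pmf is the convolution of the two distributions.
            """
            cdf = 0.0
            for s in range(max_s + 1):
                p_s = sum(binom.pmf(j, x_last, alpha) * poisson.pmf(s - j, lam)
                          for j in range(min(s, x_last) + 1))
                cdf += p_s
                if cdf >= 0.5:
                    return s
            return max_s

        x = simulate_inar1(alpha=0.6, lam=2.0, n=200)
        print(x[-5:], median_forecast(0.6, 2.0, x_last=int(x[-1])))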

  7. Springer-Verlag history of a scientific publishing house

    CERN Document Server

    Sarkowski, Heinz

    1996-01-01

    This book describes the fortunes and activities of one of the few specialist publishing houses still in the hands of the same family that established it over 150 years ago and with it gives a portrayal of those members who directed it. In doing so it covers a period of momentous historical events that directly and indirectly shaped the firm's actions and achievements. But this volume tells not only, in word and picture, the story of Springer-Verlag but also, interwoven with it, the story of publishing in Germany over the span of a hundred years. The text, densely packed with carefully researched facts and figures, is illuminated and supplemented by many illustrations whose captions, together with the author's notes, contain a wealth of important and interesting information. A second volume contains the history of the publishing house from 1945 to 1992.

  8. Extreme Value Theory Applied to the Millennial Sunspot Number Series

    Science.gov (United States)

    Acero, F. J.; Gallego, M. C.; García, J. A.; Usoskin, I. G.; Vaquero, J. M.

    2018-01-01

    In this work, we use two decadal sunspot number series reconstructed from cosmogenic radionuclide data (14C in tree trunks, SN 14C, and 10Be in polar ice, SN 10Be) and the extreme value theory to study the variability of solar activity during the last nine millennia. The peaks-over-threshold technique was used to compute, in particular, the shape parameter of the generalized Pareto distribution for different thresholds. Its negative value implies an upper bound of the extreme SN 10Be and SN 14C time series. The return levels for 1000 and 10,000 years were estimated, leading to values lower than the maximum observed values, as expected for the 1000 year return level but not for the 10,000 year return level, for both series. A comparison of these results with those obtained using the observed sunspot numbers from telescopic observations during the last four centuries suggests that the main characteristics of solar activity have already been recorded in the telescopic period (from 1610 to nowadays), which covers the full range of solar variability from a Grand minimum to a Grand maximum.

  9. Values in Higher Education. The Wilson Lecture Series.

    Science.gov (United States)

    Wilson, O. Meredith

    The text of a lecture in the University of Arizona Wilson Lecture Series on values in higher education is presented, with responses by Richard H. Gallagher, Jeanne McRae McCarthy, and Raymond H. Thompson. The theme of the talk is that man is by evolution and by necessity a thinking animal, who now finds himself in a technologically dependent…

  10. Springer handbook of atomic, molecular, and optical physics

    CERN Document Server

    Cassar, Mark M

    2006-01-01

    This Springer Handbook of Atomic, Molecular, and Optical Physics comprises a comprehensive reference source that unifies the entire fields of atomic, molecular, and optical (AMO) physics, assembling the principal ideas, techniques and results of the field from atomic spectroscopy to applications in comets. Its 92 chapters are written by over 100 authors, all leaders in their respective disciplines. Carefully edited to ensure uniform coverage and style, with extensive cross references, and acting as a guide to the primary research literature, it is both a source of information and an inspiration for graduate students and other researchers new to the field. Relevant diagrams, graphs, and tables of data are provided throughout the text. Substantially updated and expanded since the 1996 edition and published in conjunction with the 2005 World Year of Physics (commemorating Einstein’s 1905 "miracle year"), it contains several entirely new chapters covering current areas of great research interest, such as Bose–Einstein condensation...

  11. Hierarchical time series bottom-up approach for forecast the export value in Central Java

    Science.gov (United States)

    Mahkya, D. A.; Ulama, B. S.; Suhartono

    2017-10-01

    The purpose of this study is to obtain the best model for, and to forecast, the export value of Central Java using a hierarchical time series approach. The export value is an injection variable in a country's economy: if the export value of the country increases, the country's economy grows further. Therefore, appropriate modeling is necessary to predict the export value, especially in Central Java. Export values in Central Java are grouped into 21 commodities, each with a different pattern. One approach that can be used for such time series is a hierarchical one; here the bottom-up Hierarchical Time Series approach is used. To forecast the individual series at all levels, Autoregressive Integrated Moving Average (ARIMA), Radial Basis Function Neural Network (RBFNN), and hybrid ARIMA-RBFNN models are used. For the selection of the best models, the Symmetric Mean Absolute Percentage Error (sMAPE) is used. The results of the analysis showed that, for the export value of Central Java, the bottom-up approach with hybrid ARIMA-RBFNN modeling can be used for long-term predictions, while for short- and medium-term predictions the bottom-up approach with RBFNN modeling can be used. Overall, the bottom-up approach with RBFNN modeling gives the best result.
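
    The bottom-up step itself is simple: forecast each commodity series separately and sum the forecasts to obtain the provincial total. A minimal sketch under assumed synthetic data, with a naive drift forecaster standing in for the ARIMA/RBFNN models used in the study:

        import numpy as np

        rng = np.random.default_rng(1)
        # synthetic monthly export values for three commodity series (the bottom level)
        commodities = {f"commodity_{i}": rng.gamma(5.0, 10.0, size=60).cumsum() for i in range(3)}

        def drift_forecast(series, h):
            """Naive drift forecast; a placeholder for the ARIMA/RBFNN models of the study."""
            slope = (series[-1] - series[0]) / (len(series) - 1)
            return series[-1] + slope * np.arange(1, h + 1)

        h = 6                                          # forecast horizon in months
        bottom = {name: drift_forecast(s, h) for name, s in commodities.items()}

        # bottom-up aggregation: the total export forecast is the sum of the commodity forecasts
        total_forecast = np.sum(list(bottom.values()), axis=0)
        print(total_forecast)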

  12. Two-pass imputation algorithm for missing value estimation in gene expression time series.

    Science.gov (United States)

    Tsiporkova, Elena; Boeva, Veselka

    2007-10-01

    Gene expression microarray experiments frequently generate datasets with multiple values missing. However, most of the analysis, mining, and classification methods for gene expression data require a complete matrix of gene array values. Therefore, the accurate estimation of missing values in such datasets has been recognized as an important issue, and several imputation algorithms have already been proposed to the biological community. Most of these approaches, however, are not particularly suitable for time series expression profiles. In view of this, we propose a novel imputation algorithm, which is specially suited for the estimation of missing values in gene expression time series data. The algorithm utilizes Dynamic Time Warping (DTW) distance in order to measure the similarity between time expression profiles, and subsequently selects for each gene expression profile with missing values a dedicated set of candidate profiles for estimation. Three different DTW-based imputation (DTWimpute) algorithms have been considered: position-wise, neighborhood-wise, and two-pass imputation. These have initially been prototyped in Perl, and their accuracy has been evaluated on yeast expression time series data using several different parameter settings. The experiments have shown that the two-pass algorithm consistently outperforms, in particular for datasets with a higher level of missing entries, the neighborhood-wise and the position-wise algorithms. The performance of the two-pass DTWimpute algorithm has further been benchmarked against the weighted K-Nearest Neighbors algorithm, which is widely used in the biological community; the former algorithm has appeared superior to the latter one. Motivated by these findings, indicating clearly the added value of the DTW techniques for missing value estimation in time series data, we have built an optimized C++ implementation of the two-pass DTWimpute algorithm. The software also provides for a choice between three different
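
    The profile-similarity step relies on the DTW distance, which can be written compactly with the standard dynamic-programming recursion. The sketch below is generic DTW between two expression profiles, not the authors' full DTWimpute pipeline:

        import numpy as np

        def dtw_distance(a, b):
            """Classic dynamic-programming DTW distance between two 1-D profiles."""
            n, m = len(a), len(b)
            cost = np.full((n + 1, m + 1), np.inf)
            cost[0, 0] = 0.0
            for i in range(1, n + 1):
                for j in range(1, m + 1):
                    d = abs(a[i - 1] - b[j - 1])
                    cost[i, j] = d + min(cost[i - 1, j],      # insertion
                                         cost[i, j - 1],      # deletion
                                         cost[i - 1, j - 1])  # match
            return cost[n, m]

        # profiles with a similar shape but shifted in time get a small DTW distance
        p1 = np.sin(np.linspace(0.0, 3.0, 20))
        p2 = np.sin(np.linspace(0.3, 3.3, 20))
        print(dtw_distance(p1, p2))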

  13. Introduction to time series and forecasting

    CERN Document Server

    Brockwell, Peter J

    2016-01-01

    This book is aimed at the reader who wishes to gain a working knowledge of time series and forecasting methods as applied to economics, engineering and the natural and social sciences. It assumes knowledge only of basic calculus, matrix algebra and elementary statistics. This third edition contains detailed instructions for the use of the professional version of the Windows-based computer package ITSM2000, now available as a free download from the Springer Extras website. The logic and tools of time series model-building are developed in detail. Numerous exercises are included and the software can be used to analyze and forecast data sets of the user's own choosing. The book can also be used in conjunction with other time series packages such as those included in R. The programs in ITSM2000 however are menu-driven and can be used with minimal investment of time in the computational details. The core of the book covers stationary processes, ARMA and ARIMA processes, multivariate time series and state-space mod...

  14. Detection of Outliers and Imputing of Missing Values for Water Quality UV-VIS Absorbance Time Series

    Directory of Open Access Journals (Sweden)

    Leonardo Plazas-Nossa

    2017-01-01

    Full Text Available Context: UV-Vis absorbance collection using online optical sensors for water quality detection may yield outliers and/or missing values. Therefore, data pre-processing is a necessary prerequisite to monitoring data processing. Thus, the aim of this study is to propose a method that detects and removes outliers and fills gaps in time series. Method: Outliers are detected using the Winsorising procedure, and the Discrete Fourier Transform (DFT) and the Inverse Fast Fourier Transform (IFFT) are applied to complete the time series. Together, these tools were used to analyse a case study comprising three sites in Colombia: (i) Bogotá D.C. Salitre-WWTP (Waste Water Treatment Plant), influent; (ii) Bogotá D.C. Gibraltar Pumping Station (GPS); and (iii) Itagüí, San Fernando-WWTP, influent (Medellín metropolitan area), analysed via UV-Vis (Ultraviolet and Visible) spectra. Results: Outlier detection with the proposed method obtained promising results when window parameter values are small and self-similar, even though the three time series exhibited different sizes and behaviours. The DFT made it possible to process gaps of different lengths. To assess the validity of the proposed method, continuous subsets (sections of the absorbance time series without outliers or missing values) were removed from the original time series, obtaining an average 12% error rate in the three testing time series. Conclusions: The application of the DFT and the IFFT, using the 10% most important harmonics of useful values, can be useful in different applications, specifically for time series of water quality and quantity in urban sewer systems. One potential application would be the analysis of dry weather relative to rain events, achieved by detecting values that correspond to unusual behaviour in a time series. Additionally, the result hints at the potential of the method for correcting other hydrologic time series.
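
    A rough sketch of the harmonic reconstruction step described above: fill the gaps provisionally, keep only the strongest DFT harmonics, and read the inverse-FFT values back into the gaps. The 10% figure follows the abstract; the provisional linear interpolation and the function name are assumed simplifications.

        import numpy as np

        def fill_gaps_dft(y, keep_fraction=0.10):
            """Fill NaN gaps in a series using the most important DFT harmonics."""
            y = np.asarray(y, dtype=float)
            missing = np.isnan(y)
            idx = np.arange(len(y))

            # provisional values so the transform can be computed at all
            provisional = y.copy()
            provisional[missing] = np.interp(idx[missing], idx[~missing], y[~missing])

            spectrum = np.fft.fft(provisional)
            k = max(1, int(keep_fraction * len(y)))    # number of harmonics to keep
            keep = np.argsort(np.abs(spectrum))[-k:]   # indexes of the strongest harmonics
            truncated = np.zeros_like(spectrum)
            truncated[keep] = spectrum[keep]
            reconstructed = np.fft.ifft(truncated).real

            filled = y.copy()
            filled[missing] = reconstructed[missing]
            return filled

        t = np.arange(200, dtype=float)
        signal = 3 * np.sin(2 * np.pi * t / 24) + 0.3 * np.random.default_rng(2).normal(size=200)
        signal[60:75] = np.nan                         # simulate a sensor outage
        print(fill_gaps_dft(signal)[60:75])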

  15. Occurrence of CPPopt Values in Uncorrelated ICP and ABP Time Series.

    Science.gov (United States)

    Cabeleira, M; Czosnyka, M; Liu, X; Donnelly, J; Smielewski, P

    2018-01-01

    Optimal cerebral perfusion pressure (CPPopt) is a concept that uses the pressure reactivity (PRx)-CPP relationship over a given period to find a value of CPP at which PRx shows best autoregulation. It has been proposed that this relationship be modelled by a U-shaped curve, where the minimum is interpreted as being the CPP value that corresponds to the strongest autoregulation. Owing to the nature of the calculation and the signals involved in it, the occurrence of CPPopt curves generated by non-physiological variations of intracranial pressure (ICP) and arterial blood pressure (ABP), termed here "false positives", is possible. Such random occurrences would artificially increase the yield of CPPopt values and decrease the reliability of the methodology. In this work, we studied the probability of the random occurrence of false positives and we compared the effect of the parameters used for CPPopt calculation on this probability. To simulate the occurrence of false positives, uncorrelated ICP and ABP time series were generated by destroying the relationship between the waves in real recordings. The CPPopt algorithm was then applied to these new series and the number of false positives was counted for different values of the algorithm's parameters. The percentage of CPPopt curves generated from uncorrelated data was demonstrated to be 11.5%. This value can be minimised by tuning some of the calculation parameters, such as increasing the calculation window and increasing the minimum PRx span accepted on the curve.

  16. Extreme events in total ozone over Arosa: Application of extreme value theory and fingerprints of atmospheric dynamics and chemistry and their effects on mean values and long-term changes

    Science.gov (United States)

    Rieder, Harald E.; Staehelin, Johannes; Maeder, Jörg A.; Peter, Thomas; Ribatet, Mathieu; Davison, Anthony C.; Stübi, Rene; Weihs, Philipp; Holawe, Franz

    2010-05-01

    … El Chichón, Mt. Pinatubo). Furthermore, the atmospheric loading of ozone-depleting substances leads to a continuous modification of column ozone in the northern hemisphere, also with respect to extreme values (partly again in connection with polar vortex contributions). It is shown that the application of extreme value theory allows the identification of many more such fingerprints than conventional time series analysis of annual and seasonal mean values. In particular, the analysis shows the strong influence of dynamics, revealing that even moderate ENSO and NAO events have a discernible effect on total ozone (Rieder et al., 2010b). Overall, the presented new extremes concept provides new information on time series properties, variability, trends and the influence of dynamics and chemistry, complementing earlier analyses focusing only on monthly (or annual) mean values. References: Coles, S.: An Introduction to Statistical Modeling of Extreme Values, Springer Series in Statistics, ISBN:1852334592, Springer, Berlin, 2001. Ribatet, M.: POT: Modelling peaks over a threshold, R News, 7, 34-36, 2007. Rieder, H.E., Staehelin, J., Maeder, J.A., Ribatet, M., Stübi, R., Weihs, P., Holawe, F., Peter, T., and Davison, A.D. (2010): Extreme events in total ozone over Arosa - Part I: Application of extreme value theory, to be submitted to ACPD. Rieder, H.E., Staehelin, J., Maeder, J.A., Ribatet, M., Stübi, R., Weihs, P., Holawe, F., Peter, T., and Davison, A.D. (2010): Extreme events in total ozone over Arosa - Part II: Fingerprints of atmospheric dynamics and chemistry and effects on mean values and long-term changes, to be submitted to ACPD. Staehelin, J., Renaud, A., Bader, J., McPeters, R., Viatte, P., Hoegger, B., Bugnion, V., Giroud, M., and Schill, H.: Total ozone series at Arosa (Switzerland): Homogenization and data comparison, J. Geophys. Res., 103(D5), 5827-5842, doi:10.1029/97JD02402, 1998a. Staehelin, J., Kegel, R., and Harris, N. R.: Trend analysis of the homogenized total ozone series of

  17. The Ethics of Biomedical Big Data : Brent Daniel Mittelstadt and Luciano Floridi, eds. 2016, Springer International Publishing (Cham, Switzerland, 978-3-319-33523-0, 480 pp.).

    Science.gov (United States)

    Mason, Paul H

    2017-12-01

    The availability of diverse sources of data related to health and illness from various types of modern communication technology presents the possibility of augmenting medical knowledge, clinical care, and the patient experience. New forms of data collection and analysis will undoubtedly transform epidemiology, public health, and clinical practice, but what ethical considerations come into play? With a view to analysing the ethical and regulatory dimensions of burgeoning forms of biomedical big data, Brent Daniel Mittelstadt and Luciano Floridi have brought together thirty scholars in an edited volume that forms part of Springer's Law, Governance and Technology book series in a collection titled The Ethics of Biomedical Big Data. With eighteen chapters partitioned into six carefully devised sections, this volume engages with core theoretical, ethical, and regulatory challenges posed by biomedical big data.

  18. How to recover an L-series from its values at almost all positive ...

    Indian Academy of Sciences (India)

    Abstract. We define a class of analytic functions which can be obtained from their values at almost all positive integers by a canonical interpolation procedure. All the usual L-functions belong to this class which is interesting in view of the extensive investigations of special values of motivic L-series. A number of classical ...

  19. The World's Approach toward Publishing in Springer and Elsevier's APC-Funded Open Access Journals

    Science.gov (United States)

    Sotudeh, Hajar; Ghasempour, Zahra

    2018-01-01

    Purpose: The present study explored tendencies of the world's countries--at individual and scientific development levels--toward publishing in APC-funded open access journals. Design/Methodology/Approach: Using a bibliometric method, it studied OA and NOA articles issued in Springer and Elsevier's APC journals during 2007-2011. The data were…

  20. A cluster merging method for time series microarray with production values.

    Science.gov (United States)

    Chira, Camelia; Sedano, Javier; Camara, Monica; Prieto, Carlos; Villar, Jose R; Corchado, Emilio

    2014-09-01

    A challenging task in time-course microarray data analysis is to cluster genes meaningfully combining the information provided by multiple replicates covering the same key time points. This paper proposes a novel cluster merging method to accomplish this goal obtaining groups with highly correlated genes. The main idea behind the proposed method is to generate a clustering starting from groups created based on individual temporal series (representing different biological replicates measured in the same time points) and merging them by taking into account the frequency by which two genes are assembled together in each clustering. The gene groups at the level of individual time series are generated using several shape-based clustering methods. This study is focused on a real-world time series microarray task with the aim to find co-expressed genes related to the production and growth of a certain bacteria. The shape-based clustering methods used at the level of individual time series rely on identifying similar gene expression patterns over time which, in some models, are further matched to the pattern of production/growth. The proposed cluster merging method is able to produce meaningful gene groups which can be naturally ranked by the level of agreement on the clustering among individual time series. The list of clusters and genes is further sorted based on the information correlation coefficient and new problem-specific relevant measures. Computational experiments and results of the cluster merging method are analyzed from a biological perspective and further compared with the clustering generated based on the mean value of time series and the same shape-based algorithm.
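
    The merging criterion, how often two genes land in the same cluster across the replicate-level clusterings, can be captured by a co-clustering frequency matrix. The sketch below is a generic illustration of that idea, not the authors' exact merging rule:

        import numpy as np

        def coclustering_frequency(labelings):
            """labelings: list of cluster-label arrays, one per biological replicate.

            Returns an n x n matrix whose (i, j) entry is the fraction of replicates
            in which genes i and j were assigned to the same cluster.
            """
            labelings = np.asarray(labelings)
            r, n = labelings.shape
            freq = np.zeros((n, n))
            for labels in labelings:
                freq += (labels[:, None] == labels[None, :]).astype(float)
            return freq / r

        # three replicates, five genes, labels produced by some shape-based clustering
        replicate_labels = [
            [0, 0, 1, 1, 2],
            [0, 0, 1, 2, 2],
            [1, 1, 0, 0, 2],
        ]
        print(coclustering_frequency(replicate_labels))   # genes 0 and 1 co-cluster in every replicate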

  1. Autoregressive-model-based missing value estimation for DNA microarray time series data.

    Science.gov (United States)

    Choong, Miew Keen; Charbit, Maurice; Yan, Hong

    2009-01-01

    Missing value estimation is important in DNA microarray data analysis. A number of algorithms have been developed to solve this problem, but they have several limitations. Most existing algorithms are not able to deal with the situation where a particular time point (column) of the data is missing entirely. In this paper, we present an autoregressive-model-based missing value estimation method (ARLSimpute) that takes into account the dynamic property of microarray temporal data and the local similarity structures in the data. ARLSimpute is especially effective for the situation where a particular time point contains many missing values or where the entire time point is missing. Experiment results suggest that our proposed algorithm is an accurate missing value estimator in comparison with other imputation methods on simulated as well as real microarray time series datasets.
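
    A very small sketch of the general idea, an autoregressive fit on the observed part of a profile used to estimate a missing time point. It is a plain least-squares AR(2) on synthetic data, standing in for, not reproducing, the ARLSimpute algorithm:

        import numpy as np

        rng = np.random.default_rng(3)
        profile = np.cumsum(rng.normal(size=30))   # one gene's expression over 30 time points
        missing_at = 29                            # suppose the last time point is missing
        observed = profile[:missing_at]

        # fit AR(2) by ordinary least squares: x_t ~ c + a1*x_{t-1} + a2*x_{t-2}
        p = 2
        y = observed[p:]
        X = np.column_stack([np.ones(len(y))] +
                            [observed[p - k:len(observed) - k] for k in range(1, p + 1)])
        coef, *_ = np.linalg.lstsq(X, y, rcond=None)

        # one-step estimate for the missing point from the last two observed values
        estimate = coef[0] + coef[1] * observed[-1] + coef[2] * observed[-2]
        print(estimate, profile[missing_at])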

  2. Mean value estimates of the error terms of Lehmer problem

    Indian Academy of Sciences (India)

    Mean value estimates of the error terms of the Lehmer problem. Dongmei Ren and Yaming ... For further properties of N(a,p) in [6], he studied the mean square value of the error term E(a,p) = N(a,p) - (1/2)(p - 1). ... [1] Apostol Tom M, Introduction to Analytic Number Theory (New York: Springer-Verlag) (1976). [2] Guy R K ...

  3. Comparison of missing value imputation methods in time series: the case of Turkish meteorological data

    Science.gov (United States)

    Yozgatligil, Ceylan; Aslan, Sipan; Iyigun, Cem; Batmaz, Inci

    2013-04-01

    This study aims to compare several imputation methods to complete the missing values of spatio-temporal meteorological time series. To this end, six imputation methods are assessed with respect to various criteria including accuracy, robustness, precision, and efficiency for artificially created missing data in monthly total precipitation and mean temperature series obtained from the Turkish State Meteorological Service. Of these methods, simple arithmetic average, normal ratio (NR), and NR weighted with correlations comprise the simple ones, whereas multilayer perceptron type neural network and multiple imputation strategy adopted by Monte Carlo Markov Chain based on expectation-maximization (EM-MCMC) are computationally intensive ones. In addition, we propose a modification on the EM-MCMC method. Besides using a conventional accuracy measure based on squared errors, we also suggest the correlation dimension (CD) technique of nonlinear dynamic time series analysis which takes spatio-temporal dependencies into account for evaluating imputation performances. Depending on the detailed graphical and quantitative analysis, it can be said that although computational methods, particularly EM-MCMC method, are computationally inefficient, they seem favorable for imputation of meteorological time series with respect to different missingness periods considering both measures and both series studied. To conclude, using the EM-MCMC algorithm for imputing missing values before conducting any statistical analyses of meteorological data will definitely decrease the amount of uncertainty and give more robust results. Moreover, the CD measure can be suggested for the performance evaluation of missing data imputation particularly with computational methods since it gives more precise results in meteorological time series.

  4. Different methods for analysing and imputation missing values in wind speed series; La problematica de la calidad de la informacion en series de velocidad del viento-metodologias de analisis y imputacion de datos faltantes

    Energy Technology Data Exchange (ETDEWEB)

    Ferreira, A. M.

    2004-07-01

    This study concerns different methods for analysing and imputing missing values in wind speed series. The EM algorithm and a methodology derived from the sequential hot deck have been utilized. Series with imputed missing values are compared with the original, complete series using several criteria, such as the wind potential; there appears to be a significant goodness of fit between the estimated and real values. (Author)

  5. Protesta dei ricercatori contro gli editori: politiche di prezzo insostenibili per la ricerca. Nel mirino Elsevier e Springer [Researchers protest against the publishers: pricing policies unsustainable for research; Elsevier and Springer in the crosshairs]

    Directory of Open Access Journals (Sweden)

    Maria Chiara Pievatolo

    2012-02-01

    Full Text Available For those who do not follow our Twitter service, I am reposting the news that appeared yesterday in the Ciber Newsletter, on PLEIADI and elsewhere. The monstrous profit margins of the multinational scientific publishers are reported here. Update 2/2/2012: Springer's reply can be seen here; Elsevier's here. Here is an analysis of their arguments. Update 3/2/2012: the Economist [...]

  6. Primary seborrhoea in English springer spaniels: a retrospective study of 14 cases.

    Science.gov (United States)

    Scott, D W; Miller, W H

    1996-04-01

    Primary seborrhoea was diagnosed in 14 English springer spaniels over a 17-year period. Seven of the dogs developed clinical signs by two years of age. The dermatosis began as a generalised non-pruritic dry scaling which gradually worsened. Some dogs remained in this dry (seborrhoea sicca) stage, but in most cases the dermatosis became greasy and inflamed (seborrhoea oleosa and seborrhoeic dermatitis). Eight of the dogs suffered from recurrent episodes of superficial or deep bacterial pyoderma. Histological findings in skin biopsy specimens included marked orthokeratotic hyperkeratosis of surface and infundibular epithelium, papillomatosis, parakeratotic capping of the papillae, and superficial perivascular dermatitis in which lymphocytes and mast cells were prominent. The dogs with seborrhoea sicca responded more satisfactorily to therapy with topical emollient-humectant agents or oral omega-3/omega-6 fatty acid supplementation. Dogs with seborrhoea oleosa and seborrhoeic dermatitis did not respond satisfactorily to topical therapy. One dog, however, responded well to etretinate and omega-3/omega-6 fatty acid administration. No dog was cured.

  7. From Fourier Series to Rapidly Convergent Series for Zeta(3)

    DEFF Research Database (Denmark)

    Scheufens, Ernst E

    2011-01-01

    The article presents a mathematical study which investigates the exact values of the Riemann zeta (ζ) function. It states that exact values can be determined from Fourier series for periodic versions of even power functions. It notes that using power series for logarithmic functions on this ...
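
    As an illustration of the mechanism mentioned above (a standard textbook example, not taken from the article): the Fourier series of the 2π-periodic extension of x² on [-π, π] is

        x^2 \sim \frac{\pi^2}{3} + 4\sum_{n=1}^{\infty} \frac{(-1)^n}{n^2}\cos nx,

    and evaluating it at x = π gives \zeta(2) = \sum_{n\ge 1} n^{-2} = \pi^2/6; the article then brings in power series for logarithmic functions to reach rapidly convergent series for \zeta(3).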

  8. Optimal Value of Series Capacitors for Uniform Field Distribution in Transmission Line MRI Coils

    DEFF Research Database (Denmark)

    Zhurbenko, Vitaliy

    2016-01-01

    Transmission lines are often used as coils in high field magnetic resonance imaging (MRI). Due to the distributed nature of transmission lines, coils based on them produce an inhomogeneous field. This work investigates the application of series capacitors to improve field homogeneity along the coil. The equations for optimal values of evenly distributed capacitors are derived and expressed in terms of the implemented transmission line parameters. The achieved magnetic field homogeneity is estimated under a quasistatic approximation and compared to the regular transmission line resonator. Finally, a more practical case of a microstrip line coil with two series capacitors is considered...

  9. U. Blossing, G. Imsen & L. Moos (eds.), The Nordic Education Model: 'A School for All' Encounters Neo-Liberal Policy (Dordrecht: Springer, 2014)

    Directory of Open Access Journals (Sweden)

    Þorlákur Axel Jónsson

    2016-03-01

    Full Text Available Book review of: U. Blossing, G. Imsen & L. Moos (eds.), The Nordic Education Model: 'A School for All' Encounters Neo-Liberal Policy (Dordrecht: Springer, Policy Implications of Research in Education 1, 2014)

  10. Measure-valued solutions to the complete Euler system revisited

    Czech Academy of Sciences Publication Activity Database

    Březina, J.; Feireisl, Eduard

    2018-01-01

    Vol. 69, No. 3 (2018), article No. 57. ISSN 0044-2275 EU Projects: European Commission(XE) 320078 - MATHEF Institutional support: RVO:67985840 Keywords: Euler system * measure-valued solution * vanishing dissipation limit Subject RIV: BA - General Mathematics OBOR OECD: Pure mathematics Impact factor: 1.687, year: 2016 https://link.springer.com/article/10.1007/s00033-018-0951-8

  11. Book review: Rare earth elements—A new approach to the nexus of supply, demand and use: exemplified along the use of neodymium in permanent magnets

    Science.gov (United States)

    Van Gosen, Bradley S.

    2015-01-01

    This book is part of the “Springer Theses” published by Springer, a book series designed to highlight and share outstanding Ph.D. research. As explained by Springer (on the second page), this series “brings together a selection of the very best Ph.D. theses from around the world and across the physical sciences.” I admire Springer for this endeavor because a large number of exceptional theses are never published, and thus are difficult to obtain, while most of those that are published are presented only in highly condensed form as journal articles.

  12. The focal boundary value problem for strongly singular higher-order nonlinear functional-differential equations

    Czech Academy of Sciences Publication Activity Database

    Mukhigulashvili, Sulkhan; Půža, B.

    2015-01-01

    Vol. 2015, January (2015), p. 17. ISSN 1687-2770. Institutional support: RVO:67985840. Keywords: higher order nonlinear functional-differential equations * two-point right-focal boundary value problem * strong singularity. Subject RIV: BA - General Mathematics. Impact factor: 0.642, year: 2015. http://link.springer.com/article/10.1186%2Fs13661-014-0277-1

  13. Absolute continuity for operator valued completely positive maps on C∗-algebras

    Science.gov (United States)

    Gheondea, Aurelian; Kavruk, Ali Şamil

    2009-02-01

    Motivated by applicability to quantum operations, quantum information, and quantum probability, we investigate the notion of absolute continuity for operator valued completely positive maps on C∗-algebras, previously introduced by Parthasarathy [in Athens Conference on Applied Probability and Time Series Analysis I (Springer-Verlag, Berlin, 1996), pp. 34-54]. We obtain an intrinsic definition of absolute continuity, we show that the Lebesgue decomposition defined by Parthasarathy is the maximal one among all other Lebesgue-type decompositions and that this maximal Lebesgue decomposition does not depend on the jointly dominating completely positive map, we obtain more flexible formulas for calculating the maximal Lebesgue decomposition, and we point out the nonuniqueness of the Lebesgue decomposition as well as a sufficient condition for uniqueness. In addition, we consider Radon-Nikodym derivatives for absolutely continuous completely positive maps that, in general, are unbounded positive self-adjoint operators affiliated to a certain von Neumann algebra, and we obtain a spectral approximation by bounded Radon-Nikodym derivatives. An application to the existence of the infimum of two completely positive maps is indicated, and formulas in terms of Choi's matrices for the Lebesgue decomposition of completely positive maps in matrix algebras are obtained.

  14. 31 CFR 351.33 - What are interest rates and redemption values for Series EE bonds issued May 1, 1997, through...

    Science.gov (United States)

    2010-07-01

    31 Money and Finance: Treasury 2 (2010-07-01). Series EE Savings Bonds with Issue Dates of May 1, 1997, Through April 1, 2005. § 351.33 What are interest rates and redemption values for Series EE bonds issued May 1, 1997, through April 1, 2005, during an...

  15. News from the Library: Trial access to Springer Engineering e-books - test it and let us know!

    CERN Multimedia

    CERN Library

    2012-01-01

    The ambition of a state-of-the-art research library is obviously to shape its collections to meet the needs of the different communities of readers that it serves.   To this end, we try to expand our (e-)book collections to provide better coverage of subject areas where such a development is needed. The good news is that the CERN community now has the opportunity to access the whole collection of engineering e-books published by Springer between 2005 and 2012. The trial period ends on 30 November 2012 and will help us to monitor usage and better shape our collections accordingly. This valuable collection is available and searchable here. Please send questions and feedback to library.desk@cern.ch.

  16. Time-Critical Cooperative Path Following of Multiple UAVs over Time-Varying Networks

    Science.gov (United States)

    2011-01-01

    Reference fragments from the report: ...Notes in Control and Information Systems Series (K. Y. Pettersen, T. Gravdahl, and H. Nijmeijer, Eds.), Springer-Verlag, 2006; [29] M. Breivik, V. ...; [31] M. Breivik, E. Hovstein, and T. I. Fossen, Ship ...

  17. German authors on Estonian minority rights: Selbstbestimmungsrecht und Minderheitenschutz in Estland, by Carmen Thiele. Berlin: Springer, 1999; Das Recht der nationalen Minderheiten in Osteuropa, edited by Georg Brunner and Boris Meissner. Berlin ...

    Index Scriptorium Estoniae

    Annus, Taavi, 1977-

    2004-01-01

    Review of: Thiele, Carmen. Selbstbestimmungsrecht und Minderheitenschutz in Estland. Berlin: Springer, 1999; Das Recht der nationalen Minderheiten in Osteuropa. Berlin: Berlin Verlag Arno Spitz, 1999

  18. Non-linear time series: extreme events and integer value problems

    CERN Document Server

    Turkman, Kamil Feridun; Zea Bermudez, Patrícia

    2014-01-01

    This book offers a useful combination of probabilistic and statistical tools for analyzing nonlinear time series. Key features of the book include a study of the extremal behavior of nonlinear time series and a comprehensive list of nonlinear models that address different aspects of nonlinearity. Several inferential methods, including quasi likelihood methods, sequential Markov Chain Monte Carlo Methods and particle filters, are also included so as to provide an overall view of the available tools for parameter estimation for nonlinear models. A chapter on integer time series models based on several thinning operations, which brings together all recent advances made in this area, is also included. Readers should have attended a prior course on linear time series, and a good grasp of simulation-based inferential methods is recommended. This book offers a valuable resource for second-year graduate students and researchers in statistics and other scientific areas who need a basic understanding of nonlinear time ...

  19. Decision rules for decision tables with many-valued decisions

    KAUST Repository

    Chikalov, Igor

    2011-01-01

    In the paper, the authors present a greedy algorithm for the construction of exact and partial decision rules for decision tables with many-valued decisions. Exact decision rules can be 'over-fitted', so instead of exact decision rules with many attributes, it is more appropriate to work with partial decision rules with a smaller number of attributes. Based on results for the set cover problem, the authors study bounds on the accuracy of the greedy algorithm for exact and partial decision rule construction, and the complexity of the problem of minimizing decision rule length. © 2011 Springer-Verlag.

  20. Application of power series to the solution of the boundary value problem for a second order nonlinear differential equation

    International Nuclear Information System (INIS)

    Semenova, V.N.

    2016-01-01

    A boundary value problem for a nonlinear second order differential equation has been considered. A numerical method has been proposed to solve this problem using power series. Results of numerical experiments are presented in the paper (in Russian).

  1. Most probable dimension value and most flat interval methods for automatic estimation of dimension from time series

    International Nuclear Information System (INIS)

    Corana, A.; Bortolan, G.; Casaleggio, A.

    2004-01-01

    We present and compare two automatic methods for dimension estimation from time series. Both methods, based on conceptually different approaches, work on the derivative of the bi-logarithmic plot of the correlation integral versus the correlation length (log-log plot). The first method searches for the most probable dimension values (MPDV) and associates to each of them a possible scaling region. The second one searches for the most flat intervals (MFI) in the derivative of the log-log plot. The automatic procedures include the evaluation of the candidate scaling regions using two reliability indices. The data set used to test the methods consists of time series from known model attractors with and without the addition of noise, structured time series, and electrocardiographic signals from the MIT-BIH ECG database. Statistical analysis of results was carried out by means of paired t-test, and no statistically significant differences were found in the large majority of the trials. Consistent results are also obtained dealing with 'difficult' time series. In general for a more robust and reliable estimate, the use of both methods may represent a good solution when time series from complex systems are analyzed. Although we present results for the correlation dimension only, the procedures can also be used for the automatic estimation of generalized q-order dimensions and pointwise dimension. We think that the proposed methods, eliminating the need of operator intervention, allow a faster and more objective analysis, thus improving the usefulness of dimension analysis for the characterization of time series obtained from complex dynamical systems
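
    The core step both methods share can be illustrated with a short Python sketch: estimate the correlation integral C(r) of a delay-embedded series, take the local slope of log C(r) versus log r, and look for a flat region in that derivative. This is only a schematic illustration; the embedding parameters, radii and flatness criterion below are assumptions, not the ones used in the paper.

        import numpy as np

        def correlation_integral(x, radii, dim=3, tau=1):
            """Grassberger-Procaccia correlation integral for a delay embedding."""
            n = len(x) - (dim - 1) * tau
            emb = np.column_stack([x[i * tau:i * tau + n] for i in range(dim)])
            d = np.sqrt(((emb[:, None, :] - emb[None, :, :]) ** 2).sum(-1))  # pairwise distances
            dists = d[np.triu_indices(n, k=1)]
            return np.array([(dists < r).mean() for r in radii])

        # toy data: a noisy sine is low-dimensional
        x = np.sin(0.3 * np.arange(1000)) + 0.01 * np.random.randn(1000)
        radii = np.logspace(-2, 0.5, 30)
        C = correlation_integral(x, radii)

        mask = C > 0
        logr, logC = np.log(radii[mask]), np.log(C[mask])
        slope = np.gradient(logC, logr)              # local dimension estimate D2(r)

        # "most flat interval": the radius window where the slope varies least
        window = 5
        flatness = [slope[i:i + window].std() for i in range(len(slope) - window)]
        i0 = int(np.argmin(flatness))
        print("estimated correlation dimension:", slope[i0:i0 + window].mean())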

  2. Construction of α-decision trees for tables with many-valued decisions

    KAUST Repository

    Moshkov, Mikhail; Zielosko, Beata

    2011-01-01

    The paper is devoted to the study of a greedy algorithm for the construction of approximate decision trees (α-decision trees). This algorithm is applicable to decision tables with many-valued decisions, where each row is labeled with a set of decisions. For a given row, we should find a decision from the set attached to this row. We consider a bound on the number of algorithm steps and a bound on the algorithm accuracy relative to the depth of decision trees. © 2011 Springer-Verlag.

  3. Correction to: Comparing the UK EQ-5D-3L and English EQ-5D-5L Value Sets

    NARCIS (Netherlands)

    B. Mulhern (Brendan); Feng, Y. (Yan); K.K. Shah (Koonal); M.F. Janssen (Bas); M. Herdman (Michael); van Hout, B. (Ben); N. Devlin (Nancy)

    2018-01-01

    textabstractThe article Comparing the UK EQ-5D-3L and English EQ-5D-5L Value Sets, written by Brendan Mulhern, Yan Feng, Koonal Shah, Mathieu F. Janssen, Michael Herdman, Ben van Hout, Nancy Devlin was originally published electronically on the publisher’s internet portal (currently SpringerLink) on

  4. A GA based penalty function technique for solving constrained redundancy allocation problem of series system with interval valued reliability of components

    Science.gov (United States)

    Gupta, R. K.; Bhunia, A. K.; Roy, D.

    2009-10-01

    In this paper, we have considered the problem of constrained redundancy allocation of series system with interval valued reliability of components. For maximizing the overall system reliability under limited resource constraints, the problem is formulated as an unconstrained integer programming problem with interval coefficients by penalty function technique and solved by an advanced GA for integer variables with interval fitness function, tournament selection, uniform crossover, uniform mutation and elitism. As a special case, considering the lower and upper bounds of the interval valued reliabilities of the components to be the same, the corresponding problem has been solved. The model has been illustrated with some numerical examples and the results of the series redundancy allocation problem with fixed value of reliability of the components have been compared with the existing results available in the literature. Finally, sensitivity analyses have been shown graphically to study the stability of our developed GA with respect to the different GA parameters.
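
    As a rough illustration of the penalty-function idea (with point-valued rather than interval-valued reliabilities, so this is not the authors' algorithm), the following Python sketch folds the resource constraint into the fitness of a plain integer GA with tournament selection, uniform crossover, mutation and elitism. All numbers and operator choices are illustrative assumptions.

        import random

        r = [0.80, 0.85, 0.90]      # component reliabilities (point values, not intervals)
        c = [3.0, 4.0, 5.0]         # component costs
        BUDGET, PENALTY = 30.0, 10.0

        def system_reliability(n):
            rel = 1.0
            for ri, ni in zip(r, n):
                rel *= 1.0 - (1.0 - ri) ** ni       # redundant components in each series stage
            return rel

        def fitness(n):
            cost = sum(ci * ni for ci, ni in zip(c, n))
            return system_reliability(n) - PENALTY * max(0.0, cost - BUDGET)  # penalty term

        def tournament(pop):
            a, b = random.sample(pop, 2)
            return a if fitness(a) > fitness(b) else b

        def crossover(p1, p2):
            return [random.choice(g) for g in zip(p1, p2)]     # uniform crossover

        def mutate(n, p=0.2):
            return [max(1, ni + random.choice([-1, 1])) if random.random() < p else ni for ni in n]

        pop = [[random.randint(1, 5) for _ in r] for _ in range(40)]
        for _ in range(100):
            elite = max(pop, key=fitness)                       # elitism
            pop = [elite] + [mutate(crossover(tournament(pop), tournament(pop)))
                             for _ in range(len(pop) - 1)]
        best = max(pop, key=fitness)
        print("allocation:", best, "system reliability:", round(system_reliability(best), 4))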

  5. Hematocrit and plasma osmolality values of young-of-year shortnose sturgeon following acute exposures to combinations of salinity and temperature

    Science.gov (United States)

    Ziegeweid, J.R.; Black, M.C.

    2010-01-01

    Little is known about the physiological capabilities of young-of-year (YOY) shortnose sturgeon. In this study, plasma osmolality and hematocrit values were measured for YOY shortnose sturgeon following 48-h exposures to 12 different combinations of salinity and temperature. Hematocrit levels varied significantly with temperature and age, and plasma osmolalities varied significantly with salinity and age. Plasma osmolality and hematocrit values were similar to previously published values for other sturgeons of similar age and size in similar treatment conditions. © 2010 Springer Science+Business Media B.V.

  6. CERN Book Fair 2010 - Events and book presentations

    CERN Multimedia

    CERN Library

    2010-01-01

    A series of events and book presentations is scheduled for the 2010 CERN Book Fair.   -Springer will present its new products and services (eBooks, MyCopy Softcover Editions, SpringerBriefs, and a new physics journal: "Historical Perspectives on Contemporary Physics") and SpringerMaterials, the electronic version of the Landolt Boernstein book series. -Wiley will present two books: "A History of International Research Networking : The People who Made it Happen", edited by B. Bressan and H. Davies, and "Field computation for accelerator magnets : analytical and numerical methods for electromagnetic design and optimization" by S. Russenschuck. -Finally, World Scientific will present the series "Reviews of Accelerator Science and Technology (RAST)". The calendar is available here: http://indico.cern.ch/conferenceDisplay.py?confId=105651      

  7. Detection of Outliers and Imputing of Missing Values for Water Quality UV-VIS Absorbance Time Series

    OpenAIRE

    Plazas-Nossa, Leonardo; Ávila Angulo, Miguel Antonio; Torres, Andrés

    2017-01-01

    Context: The UV-Vis absorbance data collected with online optical sensors for water quality monitoring may contain outliers and/or missing values. Therefore, pre-processing to correct these anomalies is required to improve the analysis of monitoring data. The aim of this study is to propose a method to detect outliers as well as to fill in the gaps in time series. Method: Outliers are detected using a Winsorising procedure and the application of the Discrete Fourier Transform (DFT) and the Inverse of F...

  8. Real time alpha value measurement with Feynman-α method utilizing time series data acquisition on low enriched uranium system

    International Nuclear Information System (INIS)

    Tonoike, Kotaro; Yamamoto, Toshihiro; Watanabe, Shoichi; Miyoshi, Yoshinori

    2003-01-01

    As a part of the development of a subcriticality monitoring system, a system which has a time series data acquisition function for detector signals and a real-time evaluation function for the alpha value using the Feynman-alpha method was established, and with it the kinetic parameter (alpha value) was measured at the STACY heterogeneous core. Hashimoto's difference filter was implemented in the system, which enables measurement at a critical condition. The measurement results of the new system agreed with the pulsed neutron method. (author)
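
    For readers unfamiliar with the Feynman-alpha method, the Python sketch below shows the usual variance-to-mean construction: counts are binned into gates of increasing width T, the Feynman Y value is Var/Mean - 1, and alpha is obtained by fitting the standard curve Y(T) = Y_inf * (1 - (1 - exp(-alpha*T)) / (alpha*T)). The synthetic count data and all parameter values are assumptions for illustration only; this is not the STACY measurement system.

        import numpy as np
        from scipy.optimize import curve_fit

        def feynman_y(counts_per_tick, gate_widths):
            """Variance-to-mean ratio minus one for increasing gate widths (in ticks)."""
            y = []
            for w in gate_widths:
                m = len(counts_per_tick) // w
                gates = counts_per_tick[:m * w].reshape(m, w).sum(axis=1)
                y.append(gates.var() / gates.mean() - 1.0)
            return np.array(y)

        def y_model(T, y_inf, alpha):
            return y_inf * (1.0 - (1.0 - np.exp(-alpha * T)) / (alpha * T))

        # synthetic correlated counts, just to exercise the fit
        rng = np.random.default_rng(0)
        bursts = rng.poisson(0.2, 200_000)
        counts = rng.poisson(1.0, 200_000) + bursts + np.roll(bursts, 1) + np.roll(bursts, 2)

        widths = np.arange(1, 200, 5)        # gate widths in ticks
        tick = 1e-4                          # assumed tick length in seconds
        Y = feynman_y(counts, widths)
        popt, _ = curve_fit(y_model, widths * tick, Y, p0=[0.5, 1000.0], maxfev=10_000)
        print("fitted alpha [1/s]:", round(popt[1], 1))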

  9. Algorithms for Calculating Alternating Infinite Series

    International Nuclear Information System (INIS)

    Garcia, Hector Luna; Garcia, Luz Maria

    2015-01-01

    In this paper, novel algorithms are presented for the exact limits of a broad class of infinite alternating series. Many of these series are found in physics and other branches of science, and the exact values we find are in complete agreement with the values obtained by other authors. Finally, these simple methods are very powerful in calculating the limits of many series, as shown by the examples

  10. Uncertainties of the 50-year wind from short time series using generalized extreme value distribution and generalized Pareto distribution

    DEFF Research Database (Denmark)

    Larsén, Xiaoli Guo; Mann, Jakob; Rathmann, Ole

    2015-01-01

    This study examines the various sources of uncertainty in the application of two widely used extreme value distribution functions, the generalized extreme value distribution (GEVD) and the generalized Pareto distribution (GPD). The study is done through the analysis of measurements from ... as a guideline for applying GEVD and GPD to wind time series of limited length. The data analysis shows that, with a reasonable choice of relevant parameters, GEVD and GPD give consistent estimates of the return winds. For GEVD, the base period should be chosen in accordance with the occurrence of the extreme wind...

  11. Introduction to Time Series Modeling

    CERN Document Server

    Kitagawa, Genshiro

    2010-01-01

    In time series modeling, the behavior of a certain phenomenon is expressed in relation to the past values of itself and other covariates. Since many important phenomena in statistical analysis are actually time series and the identification of the conditional distribution of the phenomenon is an essential part of the statistical modeling, it is very important and useful to learn fundamental methods of time series modeling. Illustrating how to build models for time series using basic methods, "Introduction to Time Series Modeling" covers numerous time series models and the various tools f...

  12. Harmonic Series Meets Fibonacci Sequence

    Science.gov (United States)

    Chen, Hongwei; Kennedy, Chris

    2012-01-01

    The terms of a conditionally convergent series may be rearranged to converge to any prescribed real value. What if the harmonic series is grouped into Fibonacci length blocks? Or the harmonic series is arranged in alternating Fibonacci length blocks? Or rearranged and alternated into separate blocks of even and odd terms of Fibonacci length?

  13. Unique solvability of a non-linear non-local boundary-value problem for systems of non-linear functional differential equations

    Czech Academy of Sciences Publication Activity Database

    Dilna, N.; Rontó, András

    2010-01-01

    Vol. 60, No. 3 (2010), pp. 327-338. ISSN 0139-9918. R&D Projects: GA ČR(CZ) GA201/06/0254. Institutional research plan: CEZ:AV0Z10190503. Keywords: non-linear boundary value problem * functional differential equation * non-local condition * unique solvability * differential inequality. Subject RIV: BA - General Mathematics. Impact factor: 0.316, year: 2010. http://link.springer.com/article/10.2478%2Fs12175-010-0015-9

  14. Nordic Noir Production Values

    DEFF Research Database (Denmark)

    Waade, Anne Marit; Jensen, Pia Majbritt

    2013-01-01

    In this article the authors argue that Nordic noir constitutes a set of production values utilised and conceptualised to make Danish television series attractive in the international market. The idea of production values is embedded in a media-industrial context where market principles of target ... by relating the specific Nordic noir production values present in the two series to changing conditions in Danish television drama production, in particular the internationalisation of DR's Drama Division.

  15. Building Chaotic Model From Incomplete Time Series

    Science.gov (United States)

    Siek, Michael; Solomatine, Dimitri

    2010-05-01

    This paper presents a number of novel techniques for building a predictive chaotic model from incomplete time series. A predictive chaotic model is built by reconstructing the time-delayed phase space from an observed time series, and the prediction is made by a global model or by adaptive local models based on the dynamical neighbors found in the reconstructed phase space. In general, the building of any data-driven model depends on the completeness and quality of the data itself. However, complete data availability cannot always be guaranteed, since the measurement or data transmission may intermittently fail for various reasons. We propose two main solutions for dealing with incomplete time series: imputing and non-imputing methods. For imputing methods, we utilized interpolation methods (weighted sum of linear interpolations, Bayesian principal component analysis and cubic spline interpolation) and predictive models (neural network, kernel machine, chaotic model) for estimating the missing values. After imputing the missing values, the phase space reconstruction and chaotic model prediction are executed as a standard procedure. For non-imputing methods, we reconstructed the time-delayed phase space from the observed time series with missing values. This reconstruction results in non-continuous trajectories. However, the local model prediction can still be made from the other dynamical neighbors reconstructed from non-missing values. We implemented and tested these methods to construct a chaotic model for predicting storm surges at Hoek van Holland, the entrance of Rotterdam Port. The hourly surge time series is available for the period 1990-1996. For measuring the performance of the proposed methods, a synthetic time series with missing values, generated by applying a particular random variable to the original (complete) time series, is utilized. There are two main performance measures used in this work: (1) error measures between the actual
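
    A minimal Python sketch of the non-imputing idea, under simplifying assumptions: build delay vectors, drop any vector that touches a missing (NaN) sample, and predict the next value as the average of the futures of the nearest dynamical neighbours of the current state. Parameters and the toy logistic-map data are illustrative, not the storm-surge setup of the paper.

        import numpy as np

        def predict_next(x, dim=3, tau=1, k=5):
            """One-step local-model prediction from a delay embedding; NaN gaps are skipped."""
            n = len(x) - (dim - 1) * tau
            vecs = np.column_stack([x[i * tau:i * tau + n] for i in range(dim)])
            futures = np.full(n, np.nan)
            futures[:-1] = x[(dim - 1) * tau + 1:]       # value following each delay vector
            query = vecs[-1]                             # current state of the system
            ok = ~np.isnan(vecs).any(axis=1) & ~np.isnan(futures)
            ok[-1] = False                               # the query itself has no known future
            dists = np.linalg.norm(vecs[ok] - query, axis=1)
            nearest = np.argsort(dists)[:k]
            return futures[ok][nearest].mean()

        # toy example: logistic-map series with a few missing values
        x = np.empty(500)
        x[0] = 0.3
        for i in range(1, 500):
            x[i] = 3.9 * x[i - 1] * (1 - x[i - 1])
        x[[50, 200, 201]] = np.nan                       # simulate sensor dropouts
        print("one-step forecast:", predict_next(x), " true value:", 3.9 * x[-1] * (1 - x[-1]))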

  16. Two-qubit Bell inequality for which positive operator-valued measurements are relevant

    International Nuclear Information System (INIS)

    Vertesi, T.; Bene, E.

    2010-01-01

    A bipartite Bell inequality is derived which is maximally violated on the two-qubit state space if measurements describable by positive operator valued measure (POVM) elements are allowed, rather than restricting the possible measurements to projective ones. In particular, the presented Bell inequality requires POVMs in order to be maximally violated by a maximally entangled two-qubit state. This answers a question raised by N. Gisin [in Quantum Reality, Relativistic Causality, and Closing the Epistemic Circle: Essays in Honour of Abner Shimony, edited by W. C. Myrvold and J. Christian (Springer, The Netherlands, 2009), pp. 125-138].

  17. Euler Polynomials, Fourier Series and Zeta Numbers

    DEFF Research Database (Denmark)

    Scheufens, Ernst E

    2012-01-01

    Fourier series for Euler polynomials are used to obtain information about values of the Riemann zeta function for integer arguments greater than one. If the argument is even we recover the well-known exact values; if the argument is odd we find integral representations and rapidly convergent series.

  18. BRITS: Bidirectional Recurrent Imputation for Time Series

    OpenAIRE

    Cao, Wei; Wang, Dong; Li, Jian; Zhou, Hao; Li, Lei; Li, Yitan

    2018-01-01

    Time series are widely used as signals in many classification/regression tasks. It is ubiquitous that time series contains many missing values. Given multiple correlated time series data, how to fill in missing values and to predict their class labels? Existing imputation methods often impose strong assumptions of the underlying data generating process, such as linear dynamics in the state space. In this paper, we propose BRITS, a novel method based on recurrent neural networks for missing va...

  19. News from the Library: Landolt-Börnstein book series is now accessible online!

    CERN Multimedia

    CERN Library

    2012-01-01

    H. Landolt and R. Börnstein founded the Landolt-Börnstein physical data collection more than 125 years ago in 1883. They recognized the need for selected and easily retrievable data on the scientists’ desk. This standard work of reference occupied two volumes and 1,695 pages in 1923. Today it has grown to include around 400 paper volumes.   This raises the question: how can one search effectively for physical properties and keywords across the full text of 400 volumes, 250,000 substances and 1,200,000 citations? SpringerMaterials is the answer. It includes the content of the L-B book series and - like its print counterpart - is a fully evaluated data collection in all areas of physical sciences and engineering. It also comprises 44,000 Chemical Safety Documents (including the RoHS Restriction of use of Hazardous Substances and WEEE Waste from Electrical and Electronic Equipment). Finally, a subset of the Dortmund Data Bank Software &...

  20. Water Quality Time Series, Aggregate values, and Related Aggregate Risk Measures

    Data.gov (United States)

    U.S. Environmental Protection Agency — The excel file contains time series data of flow rates, concentrations of alachlor , atrazine, ammonia, total phosphorus, and total suspended solids observed in two...

  1. Estimating return periods of extreme values from relatively short time series of winds

    Science.gov (United States)

    Jonasson, Kristjan; Agustsson, Halfdan; Rognvaldsson, Olafur; Arfeuille, Gilles

    2013-04-01

    An important factor for determining the prospects of individual wind farm sites is the frequency of extreme winds at hub height. Here, extreme winds are defined as the value of the highest 10-minute averaged wind speed with a 50-year return period, i.e. an annual exceedance probability of 2% (Rodrigo, 2010). A frequently applied method to estimate winds in the lowest few hundred meters above ground is to extrapolate observed 10-meter winds logarithmically to higher altitudes. A recent study by Drechsel et al. (2012) showed, however, that this methodology is not as accurate as interpolating simulated results from the global ECMWF numerical weather prediction (NWP) model to the desired height. Observations of persistent low-level jets near Colima in SW Mexico also show that the logarithmic approach can give highly inaccurate results for some regions (Arfeuille et al., 2012). To address these shortcomings of limited and/or poorly representative observations and extrapolations of winds, one can use NWP models to dynamically downscale relatively coarse-resolution atmospheric analyses. In the case of limited computing resources, one typically has to make a compromise between spatial resolution and the duration of the simulated period, both of which can limit the quality of the wind farm siting. A common method to estimate maximum winds is to fit an extreme value distribution (e.g. Gumbel, GEV or Pareto) to the maximum values of each year of available data, or to the tail of these values. If data are only available for a short period, e.g. 10 or 15 years, then this will give a rather inaccurate estimate. It is possible to deal with this problem by utilizing monthly or weekly maxima, but this introduces new problems: seasonal variation, autocorrelation of neighboring values, and increased discrepancy between data and fitted distribution. We introduce a new method to estimate return periods of extreme values of winds at hub height from relatively short time series of winds, simulated...
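
    The block-maxima route mentioned above can be sketched in a few lines of Python with scipy: fit a generalized extreme value distribution to annual maxima and read off the 50-year return level as the quantile with 2% annual exceedance probability. The synthetic 15-year sample and the parametric bootstrap are only illustrative of why short series give wide uncertainties; this is not the new method introduced by the authors.

        import numpy as np
        from scipy.stats import genextreme

        rng = np.random.default_rng(1)
        # pretend these are 15 years of annual maximum 10-minute wind speeds (m/s)
        annual_maxima = 20 + 4 * rng.gumbel(size=15)

        c, loc, scale = genextreme.fit(annual_maxima)           # GEV shape, location, scale
        return_50yr = genextreme.isf(1 / 50, c, loc, scale)     # 2% annual exceedance probability

        # parametric bootstrap: refit on resampled 15-year records to see the spread
        boot = [genextreme.isf(1 / 50, *genextreme.fit(
                    genextreme.rvs(c, loc, scale, size=15, random_state=s)))
                for s in range(200)]
        print(f"50-year wind: {return_50yr:.1f} m/s "
              f"(bootstrap 5-95%: {np.percentile(boot, 5):.1f}-{np.percentile(boot, 95):.1f})")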

  2. Fourier series and orthogonal polynomials

    CERN Document Server

    Jackson, Dunham

    2004-01-01

    This text for undergraduate and graduate students illustrates the fundamental simplicity of the properties of orthogonal functions and their developments in related series. Starting with a definition and explanation of the elements of Fourier series, the text follows with examinations of Legendre polynomials and Bessel functions. Boundary value problems consider Fourier series in conjunction with Laplace's equation in an infinite strip and in a rectangle, with a vibrating string, in three dimensions, in a sphere, and in other circumstances. An overview of Pearson frequency functions is followed ...

  3. Zukunft der E-Books: Innovationen und Geschäftsmodelle von Verlagen für Medizinbibliotheken an Hochschulen: Je drei Fragen von Bruno Bauer an Klaus Bahmann (Springer), Peter Gemmel (Thieme) und Angelika Lex (Elsevier) / The future of e-books: innovations and business models of publishers for medical libraries at universities: three questions each from Bruno Bauer to Klaus Bahmann (Springer), Peter Gemmel (Thieme) and Angelika Lex (Elsevier)

    Directory of Open Access Journals (Sweden)

    Lex, Angelika

    2010-05-01

    Full Text Available In recent years, electronic books have been booming at scientific libraries. E-books have become established, especially at medical libraries, as an important part of document supply, particularly for students. The current interview with Klaus Bahmann (Springer), Peter Gemmel (Thieme) and Angelika Lex (Elsevier) provides information about the development, current situation and outlook for e-books at three big scientific publishers. They speak about the focal points of their offerings, the diverse business models and their different assessments of the future development of electronic books.

  4. Efficient Algorithms for Segmentation of Item-Set Time Series

    Science.gov (United States)

    Chundi, Parvathi; Rosenkrantz, Daniel J.

    We propose a special type of time series, which we call an item-set time series, to facilitate the temporal analysis of software version histories, email logs, stock market data, etc. In an item-set time series, each observed data value is a set of discrete items. We formalize the concept of an item-set time series and present efficient algorithms for segmenting a given item-set time series. Segmentation of a time series partitions the time series into a sequence of segments where each segment is constructed by combining consecutive time points of the time series. Each segment is associated with an item set that is computed from the item sets of the time points in that segment, using a function which we call a measure function. We then define a concept called the segment difference, which measures the difference between the item set of a segment and the item sets of the time points in that segment. The segment difference values are required to construct an optimal segmentation of the time series. We describe novel and efficient algorithms to compute segment difference values for each of the measure functions described in the paper. We outline a dynamic programming based scheme to construct an optimal segmentation of the given item-set time series. We use the item-set time series segmentation techniques to analyze the temporal content of three different data sets—Enron email, stock market data, and a synthetic data set. The experimental results show that an optimal segmentation of item-set time series data captures much more temporal content than a segmentation constructed based on the number of time points in each segment, without examining the item set data at the time points, and can be used to analyze different types of temporal data.
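
    A small Python sketch of this kind of dynamic program, under assumptions that are not necessarily those of the paper: each time point carries a set of items, a segment's item set is taken to be the union of its time points' sets (one possible measure function), the segment difference is the total symmetric difference against that set, and the DP finds the k-segmentation with minimum total difference.

        from functools import lru_cache

        series = [{"a", "b"}, {"a"}, {"a", "b"}, {"x"}, {"x", "y"}, {"y"}, {"x", "y"}]
        K = 2                                   # desired number of segments

        def seg_cost(i, j):
            """Segment difference for time points i..j-1 with the union measure function."""
            union = set().union(*series[i:j])
            return sum(len(union ^ s) for s in series[i:j])

        @lru_cache(None)
        def best(i, k):
            """Minimum total difference for series[i:] split into k segments, plus cut points."""
            n = len(series)
            if k == 1:
                return seg_cost(i, n), (n,)
            best_val, best_cuts = float("inf"), None
            for j in range(i + 1, n - k + 2):   # leave room for the remaining k-1 segments
                tail_val, tail_cuts = best(j, k - 1)
                val = seg_cost(i, j) + tail_val
                if val < best_val:
                    best_val, best_cuts = val, (j,) + tail_cuts
            return best_val, best_cuts

        cost, cuts = best(0, K)
        print("total segment difference:", cost, " segment boundaries:", cuts)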

  5. Forecasting Enrollments with Fuzzy Time Series.

    Science.gov (United States)

    Song, Qiang; Chissom, Brad S.

    The concept of fuzzy time series is introduced and used to forecast the enrollment of a university. Fuzzy time series, an aspect of fuzzy set theory, forecasts enrollment using a first-order time-invariant model. To evaluate the model, the conventional linear regression technique is applied and the predicted values obtained are compared to the…

  6. Some Characteristics Of the Financial Data Series

    Directory of Open Access Journals (Sweden)

    Gheorghe Săvoiu

    2013-05-01

    Full Text Available This paper attempts to characterize, from a theoretical point of view, financial data series relative to other statistical data, starting from financial econometrics models and from the features that emerge from the descriptive statistical analysis of these characteristic series. From the analysis of such financial data over very short, short or medium periods of time, and from the information provided by the website of the Bucharest Stock Exchange (BVB), a tendency towards large values of kurtosis (eccentricity) and skewness (asymmetry) appears as a characteristic of the series. Over a long period of time, between 1920 and 2008, this tendency appears even more clearly, as confirmed by an excerpt from the author's earlier paper written in 2009 concerning the Dow Jones Industrial Average (DJIA) index. The analysis of skewness, kurtosis and normality of the data distribution using the Jarque-Bera test, together with the identification of residual autocorrelation (serial correlation) in the presence of significant residual values and of heteroskedasticity, are the major aspects evaluated. Finally, the author investigates the best way to ensure statistical comparability, using inflationary and deflationary adjustment for financial data series, and offers a solution for selecting the appropriate indicator among absolute values, absolute variation of the absolute values, and relative variation of the absolute values expressed as percentages, finding the latter to be the best alternative for the financial modelling of economic and financial processes and phenomena.

  7. Predicting chaotic time series

    International Nuclear Information System (INIS)

    Farmer, J.D.; Sidorowich, J.J.

    1987-01-01

    We present a forecasting technique for chaotic data. After embedding a time series in a state space using delay coordinates, we "learn" the induced nonlinear mapping using local approximation. This allows us to make short-term predictions of the future behavior of a time series, using information based only on past values. We present an error estimate for this technique, and demonstrate its effectiveness by applying it to several examples, including data from the Mackey-Glass delay differential equation, Rayleigh-Benard convection, and Taylor-Couette flow

  8. Collateral missing value imputation: a new robust missing value estimation algorithm for microarray data.

    Science.gov (United States)

    Sehgal, Muhammad Shoaib B; Gondal, Iqbal; Dooley, Laurence S

    2005-05-15

    Microarray data are used in a range of application areas in biology, although often it contains considerable numbers of missing values. These missing values can significantly affect subsequent statistical analysis and machine learning algorithms so there is a strong motivation to estimate these values as accurately as possible before using these algorithms. While many imputation algorithms have been proposed, more robust techniques need to be developed so that further analysis of biological data can be accurately undertaken. In this paper, an innovative missing value imputation algorithm called collateral missing value estimation (CMVE) is presented which uses multiple covariance-based imputation matrices for the final prediction of missing values. The matrices are computed and optimized using least square regression and linear programming methods. The new CMVE algorithm has been compared with existing estimation techniques including Bayesian principal component analysis imputation (BPCA), least square impute (LSImpute) and K-nearest neighbour (KNN). All these methods were rigorously tested to estimate missing values in three separate non-time series (ovarian cancer based) and one time series (yeast sporulation) dataset. Each method was quantitatively analyzed using the normalized root mean square (NRMS) error measure, covering a wide range of randomly introduced missing value probabilities from 0.01 to 0.2. Experiments were also undertaken on the yeast dataset, which comprised 1.7% actual missing values, to test the hypothesis that CMVE performed better not only for randomly occurring but also for a real distribution of missing values. The results confirmed that CMVE consistently demonstrated superior and robust estimation capability of missing values compared with other methods for both series types of data, for the same order of computational complexity. A concise theoretical framework has also been formulated to validate the improved performance of the CMVE

  9. Note on Integer-Valued Bilinear Time Series Models

    NARCIS (Netherlands)

    Drost, F.C.; van den Akker, R.; Werker, B.J.M.

    2007-01-01

    Summary. This note reconsiders the nonnegative integer-valued bilinear processes introduced by Doukhan, Latour, and Oraichi (2006). Using a hidden Markov argument, we extend their result of the existence of a stationary solution for the INBL(1,0,1,1) process to the class of superdiagonal INBL(p; q;

  10. Time Series Outlier Detection Based on Sliding Window Prediction

    Directory of Open Access Journals (Sweden)

    Yufeng Yu

    2014-01-01

    Full Text Available In order to detect outliers in hydrological time series data, and thereby improve data quality and the quality of decisions related to the design, operation, and management of water resources, this research develops a time series outlier detection method for hydrologic data that can be used to identify data that deviate from historical patterns. The method first builds a forecasting model on the historical data and then uses it to predict future values. Anomalies are assumed to take place if the observed values fall outside a given prediction confidence interval (PCI), which can be calculated from the predicted value and a confidence coefficient. The use of the PCI as threshold is mainly based on the fact that it considers the uncertainty in the data series parameters of the forecasting model, addressing the problem of suitable threshold selection. The method performs fast, incremental evaluation of data as it becomes available, scales to large quantities of data, and requires no preclassification of anomalies. Experiments with different real-world hydrologic time series showed that the proposed method is fast, correctly identifies abnormal data, and can be used for hydrologic time series analysis.
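
    The scheme can be illustrated with a short Python sketch in which a simple moving-average forecast stands in for the paper's forecasting model: each new observation is compared against a prediction confidence interval built from the previous window, and points outside it are flagged. Window size, the z factor and the synthetic data are assumptions.

        import numpy as np

        def detect_outliers(x, window=24, z=3.0):
            """Flag x[t] if it falls outside the prediction confidence interval (PCI)
            derived from the previous `window` observations."""
            flags = np.zeros(len(x), dtype=bool)
            for t in range(window, len(x)):
                past = x[t - window:t]
                pred = past.mean()                 # stand-in for a real forecasting model
                pci = z * past.std(ddof=1)         # half-width of the confidence interval
                flags[t] = abs(x[t] - pred) > pci
            return flags

        rng = np.random.default_rng(0)
        flow = 10 + np.sin(np.arange(500) / 20) + 0.3 * rng.standard_normal(500)
        flow[[120, 300]] += 5.0                    # inject two anomalous spikes
        print("flagged indices:", np.flatnonzero(detect_outliers(flow)))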

  11. Correlation and multifractality in climatological time series

    International Nuclear Information System (INIS)

    Pedron, I T

    2010-01-01

    Climate can be described by statistical analysis of mean values of atmospheric variables over a period. It is possible to detect correlations in climatological time series and to classify its behavior. In this work the Hurst exponent, which can characterize correlation and persistence in time series, is obtained by using the Detrended Fluctuation Analysis (DFA) method. Data series of temperature, precipitation, humidity, solar radiation, wind speed, maximum squall, atmospheric pressure and randomic series are studied. Furthermore, the multifractality of such series is analyzed applying the Multifractal Detrended Fluctuation Analysis (MF-DFA) method. The results indicate presence of correlation (persistent character) in all climatological series and multifractality as well. A larger set of data, and longer, could provide better results indicating the universality of the exponents.
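
    For readers unfamiliar with DFA, a compact Python sketch of the standard first-order procedure follows: integrate the mean-removed series, detrend it linearly in boxes of several sizes, and take the slope of the fluctuation function on log-log axes as the Hurst-like exponent. Box sizes and the white-noise test signal are illustrative.

        import numpy as np

        def dfa_exponent(x, scales=(8, 16, 32, 64, 128)):
            y = np.cumsum(x - np.mean(x))                 # integrated profile
            flucts = []
            for s in scales:
                f2 = []
                for b in range(len(y) // s):
                    seg = y[b * s:(b + 1) * s]
                    t = np.arange(s)
                    trend = np.polyval(np.polyfit(t, seg, 1), t)   # local linear detrending
                    f2.append(np.mean((seg - trend) ** 2))
                flucts.append(np.sqrt(np.mean(f2)))
            # slope of log F(s) versus log s gives the DFA (Hurst-like) exponent
            return np.polyfit(np.log(scales), np.log(flucts), 1)[0]

        rng = np.random.default_rng(0)
        print("white-noise DFA exponent (expected ~0.5):",
              round(dfa_exponent(rng.standard_normal(4096)), 2))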

  12. Visibility graph approach to exchange rate series

    Science.gov (United States)

    Yang, Yue; Wang, Jianbo; Yang, Huijie; Mang, Jingshi

    2009-10-01

    By means of a visibility graph, we investigate six important exchange rate series. It is found that the series convert into scale-free and hierarchically structured networks. The relationship between the scaling exponents of the degree distributions and the Hurst exponents obeys the analytical prediction for fractal Brownian motions. The visibility graph can be used to obtain reliable values of Hurst exponents of the series. The characteristics are explained by using the multifractal structures of the series. The exchange rate of EURO to Japanese Yen is widely used to evaluate risk and to estimate trends in speculative investments. Interestingly, the hierarchies of the visibility graphs for the exchange rate series of these two currencies are significantly weak compared with that of the other series.
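
    A short Python sketch of the natural visibility graph construction used in this type of analysis: two time points are linked if the straight line between them stays above every intermediate sample, and the degree distribution of the resulting network is then studied. The quadratic implementation and the random-walk test series are only illustrative.

        import numpy as np

        def visibility_edges(x):
            """Natural visibility graph: (i, j) is an edge if every intermediate point lies
            strictly below the straight line joining (i, x[i]) and (j, x[j])."""
            n, edges = len(x), []
            for i in range(n - 1):
                for j in range(i + 1, n):
                    k = np.arange(i + 1, j)
                    line = x[j] + (x[i] - x[j]) * (j - k) / (j - i)
                    if np.all(x[k] < line):
                        edges.append((i, j))
            return edges

        rng = np.random.default_rng(0)
        series = np.cumsum(rng.standard_normal(200))     # random-walk stand-in for a rate series
        edges = visibility_edges(series)
        degree = np.bincount(np.array(edges).ravel(), minlength=len(series))
        print("edges:", len(edges), " maximum degree (hub):", degree.max())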

  13. Book review: Three great tsunamis: Lisbon (1755), Sumatra-Andaman (2004), and Japan (2011)

    Science.gov (United States)

    Geist, Eric L.

    2014-01-01

    “Three Great Tsunamis: Lisbon (1755), Sumatra–Andaman (2004), and Japan (2011)” is published in Springer’s new series SpringerBriefs. According to Springer’s website, the SpringerBriefs volumes are intended to provide “concise summaries of cutting-edge research and practical applications across a wide spectrum of fields”. Among the several categories considered for SpringerBriefs are in-depth case studies, with which this volume is most closely aligned.

  14. Hidden Markov Models for Time Series An Introduction Using R

    CERN Document Server

    Zucchini, Walter

    2009-01-01

    Illustrates the flexibility of HMMs as general-purpose models for time series data. This work presents an overview of HMMs for analyzing time series data, from continuous-valued, circular, and multivariate series to binary data, bounded and unbounded counts and categorical observations.

  15. Visual time series analysis

    DEFF Research Database (Denmark)

    Fischer, Paul; Hilbert, Astrid

    2012-01-01

    We introduce a platform which supplies an easy-to-handle, interactive, extendable, and fast analysis tool for time series analysis. In contrast to other software suites like Maple, Matlab, or R, which use a command-line-like interface and where the user has to memorize or look up the appropriate commands, our application is select-and-click-driven. It allows the user to derive many different sequences of deviations for a given time series and to visualize them in different ways in order to judge their expressive power and to reuse the procedure found. For many transformations or model fits, the user may choose between manual and automated parameter selection. The user can define new transformations and add them to the system. The application contains efficient implementations of advanced and recent techniques for time series analysis including techniques related to extreme value analysis and filtering

  16. Summation of Divergent Series and Zeldovich's Regularization Method

    International Nuclear Information System (INIS)

    Mur, V.D.; Pozdnyakov, S.G.; Popruzhenko, S.V.; Popov, V.S.

    2005-01-01

    A method for summing divergent series, including perturbation-theory series, is considered. This method is an analog of Zeldovich's regularization method in the theory of quasistationary states. It is shown that the method in question is more powerful than the well-known Abel and Borel methods, but that it is compatible with them (that is, it leads to the same value for the sum of a series). The constraints on the parameter domain that arise upon the removal of the regularization of divergent integrals by this method are discussed. The dynamical Stark shifts and widths of loosely bound s states in the field of a circularly polarized electromagnetic wave are calculated at various values of the Keldysh adiabaticity parameter and the multiquantum parameter

  17. DTW-APPROACH FOR UNCORRELATED MULTIVARIATE TIME SERIES IMPUTATION

    OpenAIRE

    Phan , Thi-Thu-Hong; Poisson Caillault , Emilie; Bigand , André; Lefebvre , Alain

    2017-01-01

    International audience; Missing data are inevitable in almost all domains of applied sciences. Data analysis with missing values can lead to a loss of efficiency and unreliable results, especially for large missing sub-sequence(s). Some well-known methods for multivariate time series imputation require high correlations between series or their features. In this paper, we propose an approach based on the shape-behaviour relation in low/un-correlated multivariate time series under an assumption of...
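
    Shape-based approaches of this kind typically rest on dynamic time warping to find a donor sub-sequence whose behaviour matches the window around a gap. The Python sketch below shows only the classic DTW distance and a toy donor selection, not the full imputation method of the paper.

        import numpy as np

        def dtw_distance(a, b):
            """Classic O(len(a)*len(b)) dynamic time warping distance."""
            n, m = len(a), len(b)
            D = np.full((n + 1, m + 1), np.inf)
            D[0, 0] = 0.0
            for i in range(1, n + 1):
                for j in range(1, m + 1):
                    cost = abs(a[i - 1] - b[j - 1])
                    D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
            return D[n, m]

        query = np.sin(np.linspace(0, 3, 50))            # window just before a gap
        candidates = {"donor_1": np.sin(np.linspace(0.1, 3.1, 60)),
                      "donor_2": np.cos(np.linspace(0, 3, 50))}
        best = min(candidates, key=lambda k: dtw_distance(query, candidates[k]))
        print("most similar donor window:", best)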

  18. Introduction to partial differential equations from Fourier series to boundary-value problems

    CERN Document Server

    Broman, Arne

    2010-01-01

    This well-written, advanced-level text introduces students to Fourier analysis and some of its applications. The self-contained treatment covers Fourier series, orthogonal systems, Fourier and Laplace transforms, Bessel functions, and partial differential equations of the first and second orders. Over 260 exercises with solutions reinforce students' grasp of the material. 1970 edition.

  19. Bernoulli Polynomials, Fourier Series and Zeta Numbers

    DEFF Research Database (Denmark)

    Scheufens, Ernst E

    2013-01-01

    Fourier series for Bernoulli polynomials are used to obtain information about values of the Riemann zeta function for integer arguments greater than one. If the argument is even we recover the well-known exact values, if the argument is odd we find integral representations and rapidly convergent...

  20. The Divergence of Balanced Harmonic-Like Series

    Science.gov (United States)

    Lutzer, Carl V.; Marengo, James E.

    2006-01-01

    Consider the series ∑ a_n/n, where the value of each a_n is determined by the flip of a coin: heads on the nth toss will mean that a_n = 1 and tails that a_n = -1. Assuming that the coin is "fair," what is the probability that this "harmonic-like" series converges? After a moment's thought, many people…

  1. Summation of divergent series and Zel'dovich's regularization method

    International Nuclear Information System (INIS)

    Mur, V.D.; Pozdnyakov, S.G.; Popruzhenko, S.V.; Popov, V.S.

    2005-01-01

    A method for the summation of divergent series, including perturbation-theory series, which is an analog of Zel'dovich's regularization procedure in the theory of quasistationary states, is considered. It is shown that this method is more powerful than the well-known Abel and Borel methods, but is compatible with them (i.e., gives the same value for the sum of the series). The restrictions on the range of parameters which appear after removal of the regularization of integrals by this method are discussed. The dynamical Stark shifts and widths of weakly bound s states in the field of a circularly polarized electromagnetic wave are calculated at different values of the Keldysh adiabaticity parameter and the multiquantum parameter (in Russian).

  2. A multiple-scale power series method for solving nonlinear ordinary differential equations

    Directory of Open Access Journals (Sweden)

    Chein-Shan Liu

    2016-02-01

    Full Text Available The power series solution is a cheap and effective method to solve nonlinear problems, like the Duffing-van der Pol oscillator, the Volterra population model and nonlinear boundary value problems. A novel power series method is developed by considering multiple scales $R_k$ in the power term $(t/R_k)^k$, which are derived explicitly to reduce the ill-conditioned behavior in the data interpolation. In the method a huge value times a tiny value is avoided, so that we can decrease the numerical instability, which is the main cause of failure of the conventional power series method. The multiple scales derived from an integral can be used in the power series expansion, which provides very accurate numerical solutions of the problems considered in this paper.

  3. Chaotic characteristic of electromagnetic radiation time series of coal or rock under different scales

    Energy Technology Data Exchange (ETDEWEB)

    Zhen-Tang Liu; En-Lai Zhao; En-Yuan Wang; Jing Wang [China University of Mining and Technology, Xuzhou (China). School of Safety Engineering

    2009-02-15

    Based on chaos theory, the chaotic characteristics of electromagnetic radiation time series of coal or rock under different loads were studied. The results show that the correlation dimension of electromagnetic radiation time series of small-scale coal or rock samples and of coal mine measurements converges to a stable saturation value, which shows that these electromagnetic radiation time series have chaotic characteristics. When there is danger of a coal seam burst, the value of the saturation correlation dimension D_2 of the electromagnetic radiation time series is larger and changes greatly; when there is no danger, its value is smaller and changes smoothly. The change of the saturation correlation dimension of electromagnetic radiation time series can be used to forecast coal or rock dynamic disasters. 11 refs., 4 figs.

  4. Layered Ensemble Architecture for Time Series Forecasting.

    Science.gov (United States)

    Rahman, Md Mustafizur; Islam, Md Monirul; Murase, Kazuyuki; Yao, Xin

    2016-01-01

    Time series forecasting (TSF) has been widely used in many application areas such as science, engineering, and finance. The phenomena generating time series are usually unknown, and the information available for forecasting is limited to the past values of the series. It is, therefore, necessary to use an appropriate number of past values, termed the lag, for forecasting. This paper proposes a layered ensemble architecture (LEA) for TSF problems. Our LEA consists of two layers, each of which uses an ensemble of multilayer perceptron (MLP) networks. While the first ensemble layer tries to find an appropriate lag, the second ensemble layer employs the obtained lag for forecasting. Unlike most previous work on TSF, the proposed architecture considers both accuracy and diversity of the individual networks in constructing an ensemble. LEA trains different networks in the ensemble by using different training sets, with the aim of maintaining diversity among the networks. However, it uses the appropriate lag and combines the best trained networks to construct the ensemble, which reflects LEA's emphasis on the accuracy of the networks. The proposed architecture has been tested extensively on time series data from the NN3 and NN5 neural network competitions. It has also been tested on several standard benchmark time series. In terms of forecasting accuracy, our experimental results show clearly that LEA is better than other ensemble and nonensemble methods.
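
    A rough Python sketch of the two-layer idea, under strong simplifications (small scikit-learn MLPs, one validation split, a fixed candidate lag set): the first layer scores candidate lags with small networks, the second layer trains an ensemble on the winning lag and averages the forecasts. This is an assumption-laden illustration, not the authors' implementation.

        import numpy as np
        from sklearn.neural_network import MLPRegressor

        def make_xy(series, lag):
            X = np.array([series[i:i + lag] for i in range(len(series) - lag)])
            return X, series[lag:]

        rng = np.random.default_rng(0)
        series = np.sin(np.arange(400) / 8) + 0.1 * rng.standard_normal(400)
        split = 300                                       # boundary between training and validation

        def val_error(lag):
            """Layer 1: validation error of a small network using this lag."""
            X, y = make_xy(series, lag)
            net = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
            net.fit(X[:split - lag], y[:split - lag])
            return np.mean((net.predict(X[split - lag:]) - y[split - lag:]) ** 2)

        best_lag = min([2, 4, 8, 16], key=val_error)

        # Layer 2: an ensemble of differently seeded networks using the chosen lag
        X, y = make_xy(series, best_lag)
        ensemble = [MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=s).fit(X, y)
                    for s in range(5)]
        x_new = series[-best_lag:].reshape(1, -1)         # most recent window
        forecast = np.mean([net.predict(x_new)[0] for net in ensemble])
        print("chosen lag:", best_lag, " one-step forecast:", round(float(forecast), 3))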

  5. Reviving Graduate Seminar Series through Non-Technical Presentations

    Science.gov (United States)

    Madihally, Sundararajan V.

    2011-01-01

    Most chemical engineering programs that offer M.S. and Ph.D. degrees have a common seminar series for all the graduate students. Many would agree that seminars lack student interest, leading to ineffectiveness. We questioned the possibility of adding value to the seminar series by incorporating non-technical topics that may be more important to…

  6. III–V semiconductors

    CERN Document Server

    Freyhardt, H C

    1980-01-01

    Springer-Verlag, Berlin Heidelberg, in conjunction with Springer-Verlag New York, is pleased to announce a new series: CRYSTALS Growth, Properties, and Applications The series presents critical reviews of recent developments in the field of crystal growth, properties, and applications. A substantial portion of the new series will be devoted to the theory, mechanisms, and techniques of crystal growth. Occasionally, clear, concise, complete, and tested instructions for growing crystals will be published, particularly in the case of methods and procedures that promise to have general applicability. Responding to the ever-increasing need for crystal substances in research and industry, appropriate space will be devoted to methods of crystal characterization and analysis in the broadest sense, even though reproducible results may be expected only when structures, microstructures, and composition are really known. Relations among procedures, properties, and the morphology of crystals will also be treated with refer...

  7. Genealogical series method. Hyperpolar points screen effect

    International Nuclear Information System (INIS)

    Gorbatov, A.M.

    1991-01-01

    The fundamental values of the genealogical series method, the genealogical integrals (sandwiches), have been investigated. The hyperpolar points screen effect has been found. It allows one to calculate the sandwiches for fermion systems with a large number of particles and to ascertain the validity of the iterated-potential method as well. For the first time the genealogical series method has been realized numerically for a central spin-independent potential

  8. On primal regularity estimates for single-valued mappings

    Czech Academy of Sciences Publication Activity Database

    Cibulka, R.; Fabian, Marián; Ioffe, A. D.

    2015-01-01

    Vol. 17, No. 1 (2015), pp. 187-208. ISSN 1661-7738. R&D Projects: GA ČR(CZ) GAP201/12/0290. Institutional support: RVO:67985840. Keywords: Clarke's inverse function theorem * cofinal family * contingent tangent cone * linear openness. Subject RIV: BA - General Mathematics. Impact factor: 0.540, year: 2015. http://link.springer.com/article/10.1007%2Fs11784-015-0240-5

  9. A summation due to Ramanujan revisited

    Indian Academy of Sciences (India)

    [1] Berndt B C, Ramanujan's Notebooks Part II (1989) (Springer). [2] Berndt B C, Lamb G and Rogers M, Two-dimensional series evaluations via the elliptic functions of Ramanujan and Jacobi, Ramanujan J. 29 (2012) 185–198. [3] Hardy G H, On the convergence of certain multiple series, Proc. Cambridge Philos. Soc.

  10. Syntheses, characterizations, and catalytic activities of mesostructured aluminophosphates with tailorable acidity assembled with various preformed zeolite nanoclusters

    KAUST Repository

    Suo, Hongri; Zeng, Shangjing; Wang, Runwei; Zhang, Zongtao; Qiu, Shilun

    2015-01-01

    © 2015, Springer Science+Business Media New York. A series of ordered hexagonal mesoporous zeolites has been successfully synthesized by the assembly of various preformed aluminosilicate zeolites (MFI, FAU, BEA, etc.) with surfactants

  11. Advances in Antithetic Time Series Analysis : Separating Fact from Artifact

    Directory of Open Access Journals (Sweden)

    Dennis Ridley

    2016-01-01

    Full Text Available The problem of biased time series mathematical model parameter estimates is well known to be insurmountable. When used to predict future values by extrapolation, even a de minimis bias will eventually grow into a large bias, with misleading results. This paper elucidates how combining antithetic time series solves this baffling problem of bias in the fitted and forecast values by dynamic bias cancellation. Instead of growing to infinity, the average error can converge to a constant. (original abstract)

  12. Pseudo-random bit generator based on lag time series

    Science.gov (United States)

    García-Martínez, M.; Campos-Cantón, E.

    2014-12-01

    In this paper, we present a pseudo-random bit generator (PRBG) based on two lag time series of the logistic map using positive and negative values of the bifurcation parameter. In order to hide the map used to build the pseudo-random series, we have used a delay in the generation of the time series. When these new series are mapped, xn against xn+1, they present a cloud of points unrelated to the logistic map. Finally, the pseudo-random sequences have been tested with the NIST suite, giving satisfactory results for use in stream ciphers.
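
    An illustrative Python sketch in the same spirit (a single logistic-map orbit with a lag between emitted values, thresholded to bits; not the authors' exact two-series construction with positive and negative bifurcation parameters, and certainly not cryptographically vetted):

        def logistic_prbg(x0=0.37, r=3.99, lag=7, n_bits=64):
            """Pseudo-random bits from a lagged logistic-map orbit (illustrative only)."""
            x, bits = x0, []
            while len(bits) < n_bits:
                for _ in range(lag):            # skip `lag` iterations between emitted values
                    x = r * x * (1.0 - x)
                bits.append(1 if x >= 0.5 else 0)
            return bits

        print("".join(map(str, logistic_prbg())))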

  13. Modelos de gestión de conflictos en serie de ficción televisiva (Conflict management models in television fiction series)

    Directory of Open Access Journals (Sweden)

    Yolanda Navarro-Abal

    2012-12-01

    Full Text Available Television fiction series sometimes generate an unreal vision of life, especially among young people, becoming a mirror in which they can see themselves reflected. The series become models of values, attitudes, skills and behaviours that tend to be imitated by some viewers. The aim of this study was to analyze the conflict management behavioural styles presented by the main characters of television fiction series. Thus, we evaluated the association between these styles and the age and sex of the main characters, as well as the nationality and genre of the fiction series. Sixteen fiction series were assessed by selecting two characters, one of each sex, from each series. We adapted the Rahim Organizational Conflict Inventory-II for observing and recording the data. The results show that there is no direct association between the conflict management behavioural styles presented in the drama series and the sex of the main characters. However, associations were found between these styles and the age of the characters and the genre of the fiction series.

  14. Permutation entropy of finite-length white-noise time series.

    Science.gov (United States)

    Little, Douglas J; Kane, Deb M

    2016-08-01

    Permutation entropy (PE) is commonly used to discriminate complex structure from white noise in a time series. While the PE of white noise is well understood in the long time-series limit, analysis in the general case is currently lacking. Here the expectation value and variance of white-noise PE are derived as functions of the number of ordinal pattern trials, N, and the embedding dimension, D. It is demonstrated that the probability distribution of the white-noise PE converges to a χ^{2} distribution with D!-1 degrees of freedom as N becomes large. It is further demonstrated that the PE variance for an arbitrary time series can be estimated as the variance of a related metric, the Kullback-Leibler entropy (KLE), allowing the qualitative N≫D! condition to be recast as a quantitative estimate of the N required to achieve a desired PE calculation precision. Application of this theory to statistical inference is demonstrated in the case of an experimentally obtained noise series, where the probability of obtaining the observed PE value was calculated assuming a white-noise time series. Standard statistical inference can be used to draw conclusions whether the white-noise null hypothesis can be accepted or rejected. This methodology can be applied to other null hypotheses, such as discriminating whether two time series are generated from different complex system states.
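
    The quantity under discussion is easy to compute; a compact Python sketch follows: count ordinal patterns of embedding dimension D over the available trials and take the Shannon entropy of their relative frequencies, normalised by log D!. The white-noise and sine comparisons are only illustrative.

        import math
        from collections import Counter
        import numpy as np

        def permutation_entropy(x, D=3):
            """Normalised permutation entropy over ordinal patterns of dimension D."""
            patterns = [tuple(np.argsort(x[i:i + D])) for i in range(len(x) - D + 1)]
            counts = np.array(list(Counter(patterns).values()), dtype=float)
            p = counts / counts.sum()
            return float(-(p * np.log(p)).sum() / math.log(math.factorial(D)))

        rng = np.random.default_rng(0)
        print("white noise PE (close to 1):", round(permutation_entropy(rng.standard_normal(5000)), 3))
        print("sine wave PE (much lower):  ", round(permutation_entropy(np.sin(np.arange(5000) / 10)), 3))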

  15. Reviews in fluorescence 2007

    CERN Document Server

    Lakowicz, Joseph R; Geddes, Chris D

    2009-01-01

    This fourth volume in the Springer series summarizes the year's progress in fluorescence, with authoritative analytical reviews specialized enough for professional researchers, yet also appealing to a wider audience of scientists in related fields.

  16. Effects of a homologous series of linear alcohol ethoxylate surfactants on fathead minnow early life stages.

    Science.gov (United States)

    Lizotte, R E; Wong, D C; Dorn, P B; Rodgers, J H

    1999-11-01

    Effects of a homologous series of three primarily linear alcohol ethoxylate surfactants were studied in laboratory flow-through 28-day early-life-stage tests with fathead minnow (Pimephales promelas Rafinesque). Surfactants were a C(9-11), C(12-13), and C(14-15) with an average of 6, 6.5, and 7 ethylene oxide units per mole of alcohol, respectively. Average measured surfactant recoveries were 103%, 81%, and 79% of nominal concentrations for the C(9-11) EO 6, C(12-13) EO 6.5, and C(14-15) EO 7 studies, respectively. Embryo survival at 48 h was not adversely affected at any of the concentrations tested. Impaired hatching and deformed fry were observed only in the C(12-13) EO 6.5 study. The 28-day LC50 values were 4.87, 2.39, and 1.02 mg/L for the C(9-11) EO 6, C(12-13) EO 6.5, and C(14-15) EO 7 surfactants, respectively. The corresponding NOECs for survival were 1.01, 1.76, and 0.74 mg/L. Posthatch fry growth was more sensitive than survival for the C(12-13) EO 6.5 and C(14-15) EO 7 surfactants. Survival of posthatch fry decreased with increasing surfactant alkyl chain length. Twenty-eight-day laboratory data were compared to 96-h laboratory, 10-day laboratory and 30-day stream mesocosm data for fathead minnow previously determined for these surfactants. Survival endpoints from the different exposures were comparable and only varied within a factor of two. Similarity of results suggests that it is possible to effectively use 96-h, 10-day, or 28-day laboratory data to predict environmental effects concentrations of these surfactants for fish. http://link.springer-ny.com/link/service/journals/00244/bibs/37n4p536.html

  17. TIME SERIES ANALYSIS USING A UNIQUE MODEL OF TRANSFORMATION

    Directory of Open Access Journals (Sweden)

    Goran Klepac

    2007-12-01

    Full Text Available The REFII model is the author's own mathematical model for time series data mining. Its main purpose is to automate time series analysis through a unique transformation model of the time series. An advantage of this approach is that it links different methods of time series analysis, connects traditional data mining tools to time series, and supports the construction of new algorithms for analyzing time series. It is worth mentioning that the REFII model is not a closed system, which means that it is not restricted to a finite set of methods. First and foremost, it is a model for transforming the values of a time series, which prepares the data used by different sets of methods based on the same transformation model within the problem domain. The REFII model offers a new approach to time series analysis based on a unique transformation model, which serves as a basis for all kinds of time series analysis. Its advantage is its possible application in many different areas such as finance, medicine, voice recognition, face recognition and text mining.

  18. L-series of elliptic curves with CM by √-3

    International Nuclear Information System (INIS)

    Qiu Derong; Zhang Xianke

    2001-09-01

    Let E: y^2 = x^3 - 2^4·3^3·D^2 be elliptic curves defined over the quadratic field Q(√-3). Hecke L-series attached to E are studied, formulae for the values of the L-series at s=1 are given, and bounds on the 3-adic valuations of these values are obtained. These results are consistent with the predictions of the conjecture of Birch and Swinnerton-Dyer, and generalize results in the recent literature about elliptic curves defined over the rationals. (author)

  19. Russian State Time and Earth Rotation Service: Observations, Eop Series, Prediction

    Science.gov (United States)

    Kaufman, M.; Pasynok, S.

    2010-01-01

    The Russian State Time, Frequency and Earth Rotation Service provides the official EOP data and time for use in scientific, technical and metrological work in Russia. Observations of GLONASS and GPS from 30 stations in Russia are used, together with Russian and worldwide VLBI (35 stations) and SLR (20 stations) observations. To these three EOP series, the data calculated at two other Russian analysis centers are added: IAA (VLBI, GPS and SLR series) and MCC (SLR). Joint processing of these 7 series is carried out every day, producing the operational EOP data for the last day and the predicted values for 50 days. The EOP values are refined weekly and the systematic errors of each individual series are corrected. The combined results become accessible on the VNIIFTRI server (ftp.imvp.ru) at approximately 6h UT daily.

  20. Time Series Imputation via L1 Norm-Based Singular Spectrum Analysis

    Science.gov (United States)

    Kalantari, Mahdi; Yarmohammadi, Masoud; Hassani, Hossein; Silva, Emmanuel Sirimal

    Missing values in time series data are a well-known and important problem which many researchers have studied extensively in various fields. In this paper, a new nonparametric approach for missing value imputation in time series is proposed. The main novelty of this research is the application of the L1 norm-based version of Singular Spectrum Analysis (SSA), namely L1-SSA, which is robust against outliers. The performance of the new imputation method has been compared with many other established methods by applying them to various real and simulated time series. The obtained results confirm that the SSA-based methods, especially L1-SSA, can provide better imputation than other methods.
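
    For readers unfamiliar with SSA-based imputation, the sketch below shows the classical iterative L2/SVD variant; the paper's contribution is the outlier-robust L1-norm decomposition, which is not reproduced here, and the window length L, rank and iteration count are arbitrary choices.

```python
import numpy as np

def ssa_impute(x, L=30, rank=3, n_iter=50):
    """Iterative SSA-based imputation (standard L2/SVD variant).

    Missing entries (NaN) are initialised with the series mean, then the
    series is repeatedly embedded into an L x K trajectory (Hankel) matrix,
    approximated by its leading `rank` singular components, reconstructed by
    diagonal averaging, and the missing positions are refilled.
    """
    x = np.asarray(x, dtype=float).copy()
    miss = np.isnan(x)
    x[miss] = np.nanmean(x)
    N = len(x)
    K = N - L + 1
    idx = np.arange(L)[:, None] + np.arange(K)[None, :]   # Hankel index matrix
    for _ in range(n_iter):
        X = x[idx]                                          # L x K trajectory matrix
        U, s, Vt = np.linalg.svd(X, full_matrices=False)
        X_r = (U[:, :rank] * s[:rank]) @ Vt[:rank]
        # Diagonal averaging back to a series of length N.
        rec = np.zeros(N)
        cnt = np.zeros(N)
        np.add.at(rec, idx, X_r)
        np.add.at(cnt, idx, 1.0)
        rec /= cnt
        x[miss] = rec[miss]                                 # refill only the gaps
    return x

# Example: sine wave with artificial gaps.
t = np.arange(200)
y = np.sin(2 * np.pi * t / 25) + 0.1 * np.random.default_rng(1).normal(size=200)
y[[20, 21, 22, 90, 150]] = np.nan
print(ssa_impute(y)[[20, 21, 22, 90, 150]])
```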

  1. Dissipative measure-valued solutions to the compressible Navier-Stokes system

    Czech Academy of Sciences Publication Activity Database

    Feireisl, Eduard; Gwiazda, P.; Swierczewska-Gwiazda, A.; Wiedemann, E.

    2016-01-01

    Roč. 55, č. 6 (2016), č. článku 141. ISSN 0944-2669 EU Projects: European Commission(XE) 320078 - MATHEF Institutional support: RVO:67985840 Keywords : singular limit * compressible fluid * viscous fluid Subject RIV: BA - General Mathematics Impact factor: 1.532, year: 2016 http://link.springer.com/article/10.1007/s00526-016-1089-1

  2. PhilDB: the time series database with built-in change logging

    Directory of Open Access Journals (Sweden)

    Andrew MacDonald

    2016-03-01

    Full Text Available PhilDB is an open-source time series database that supports storage of time series datasets that are dynamic; that is, it records updates to existing values in a log as they occur. PhilDB eases loading of data for the user by utilising an intelligent data write method. It preserves existing values during updates and abstracts the update complexity required to achieve logging of data value changes. It implements fast reads to make it practical to select data for analysis. Recent open-source systems have been developed to indefinitely store long-period high-resolution time series data without change logging. Unfortunately, such systems generally require a large initial installation investment before use because they are designed to operate over a cluster of servers to achieve high-performance writing of static data in real time. In essence, they have a ‘big data’ approach to storage and access. Other open-source projects for handling time series data that avoid the ‘big data’ approach are also relatively new and are complex or incomplete. None of these systems gracefully handle revision of existing data while tracking values that change. Unlike ‘big data’ solutions, PhilDB has been designed for single machine deployment on commodity hardware, reducing the barrier to deployment. PhilDB takes a unique approach to meta-data tracking; optional attribute attachment. This facilitates scaling the complexities of storing a wide variety of data. That is, it allows time series data to be loaded as time series instances with minimal initial meta-data, yet additional attributes can be created and attached to differentiate the time series instances when a wider variety of data is needed. PhilDB was written in Python, leveraging existing libraries. While some existing systems come close to meeting the needs PhilDB addresses, none cover all the needs at once. PhilDB was written to fill this gap in existing solutions. This paper explores existing time

  3. Long Series of GNSS Integrated Precipitable Water as a Climate Change Indicator

    Directory of Open Access Journals (Sweden)

    Kruczyk Michał

    2015-12-01

    Full Text Available This paper investigates the information potential contained in the tropospheric delay product for selected International GNSS Service (IGS) stations in climatological research. Long time series of daily averaged Integrated Precipitable Water (IPW) can serve as a climate indicator. A seasonal model of IPW change has been fitted to the multi-year series by the least squares method, in two modes: sinusoidal and composite (two or more oscillations). Even a simple sinusoidal seasonal model of the daily IPW series clearly reflects the diversity of world climates. The residuals over periods of 10 up to 17 years are then searched for a long-term IPW trend, a self-evident climate change indicator. The results are ambiguous: for some stations or periods the IPW trends are quite clear, while in other years (or at other stations) they are not visible. The method used to fit the linear trend to the IPW series does not influence the value of the trend considerably. The results are mostly influenced by series length, completeness and data (e.g. meteorological) quality. The longer and more homogeneous the IPW series, the better the chance of estimating the magnitude of climatological IPW changes.
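
    A minimal sketch of the simple sinusoidal mode (annual cycle plus linear trend, fitted by least squares) might look as follows; the composite mode would add further harmonics, and the annual period, units and variable names are assumptions.

```python
import numpy as np

def fit_ipw_seasonal_trend(t_days, ipw):
    """Fit IPW(t) = a + b*t + c*sin(2*pi*t/365.25) + d*cos(2*pi*t/365.25)
    by ordinary least squares; b is the long-term linear trend."""
    t = np.asarray(t_days, dtype=float)
    w = 2 * np.pi * t / 365.25
    A = np.column_stack([np.ones_like(t), t, np.sin(w), np.cos(w)])
    coef, *_ = np.linalg.lstsq(A, np.asarray(ipw, dtype=float), rcond=None)
    a, b, c, d = coef
    amplitude = np.hypot(c, d)            # amplitude of the seasonal cycle
    trend_per_decade = b * 365.25 * 10    # e.g. kg/m^2 per decade
    return coef, amplitude, trend_per_decade

# Synthetic daily IPW series with a weak positive trend, for illustration.
t = np.arange(3650)
ipw = 25 + 0.0003 * t + 8 * np.sin(2 * np.pi * t / 365.25) + np.random.default_rng(2).normal(0, 1, t.size)
print(fit_ipw_seasonal_trend(t, ipw))
```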

  4. Astrochemistry and astrobiology

    CERN Document Server

    Smith, Ian W M; Leach, Sydney

    2014-01-01

    This debut volume in the new Springer series Physical Chemistry in Action, composed of expert contributions, is aimed at both novice and experienced researchers, and outlines the principles of the physical chemistry deployed in astrochemistry and astrobiology.

  5. Ultracool Dwarf Stars: Surveys, Properties, and Spectral Classification

    Science.gov (United States)

    Steele, Iain A.; Jones, Hugh R. A.

    2001-03-01

    Conference was held in Manchester, England, United Kingdom, in 2000 August. The Proceedings will be edited by H. R. A. Jones and I. A. Steele and published in the Lecture Notes in Physics Series by Springer-Verlag.

  6. [Deficiency, disability, neurology and television series].

    Science.gov (United States)

    Collado-Vázquez, Susana; Martínez-Martínez, Ariadna; Cano-de-la-Cuerda, Roberto

    2015-06-01

    The portrayal of neurological disability and deficiency on television has not always been approached in the same way, but has instead tended to reflect the standpoint taken by society with regard to these issues and how they are dealt with according to the prevailing conceptions and values at each particular time. To address the appearance of neurological pathologies in television series and to ponder on the image they have in such contexts. Deficiency and disability of neurological origin have often been depicted on television in series, telefilms and documentaries, and in a wide variety of ways. Here we examine different television series and how they have dealt with neurological pathology, its diagnosis and its treatment, as well as the figure of the healthcare professional and social-familial adaptation. Examples cited include series such as House MD, Glee, American Horror Story, Homeland or Game of Thrones. Television series are a useful tool for making some neurological pathologies better known to the public and for dispelling the myths surrounding others, provided that the pathologies are dealt with in a realistic manner, which is not always the case. More care should be taken with regard to the way in which health professionals are portrayed in television series, as it is not always done correctly and may mislead viewers, who take what they see on the TV as being real.

  7. The role of initial values in nonstationary fractional time series models

    DEFF Research Database (Denmark)

    Johansen, Søren; Nielsen, Morten Ørregaard

    We consider the nonstationary fractional model $\Delta^{d}X_{t}=\varepsilon_{t}$ with $\varepsilon_{t}$ i.i.d. $(0,\sigma^{2})$ and $d>1/2$. We derive an analytical expression for the main term of the asymptotic bias of the maximum likelihood estimator of $d$ conditional on initial values, and we discuss the role of the initial values for the bias. The results are partially extended to other fractional models, and three different applications of the theoretical results are given.
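
    Purely as an illustration of the model $\Delta^{d}X_{t}=\varepsilon_{t}$ conditional on zero initial values (the bias analysis itself is not reproduced, and d, n and the seed are arbitrary), such a fractional process can be simulated from the binomial expansion of $(1-B)^{-d}$.

```python
import numpy as np

def simulate_fractional(d=0.8, n=1000, seed=0):
    """Simulate X_t with Delta^d X_t = eps_t, conditional on zero initial values.

    Uses the expansion (1-B)^{-d} = sum_k psi_k B^k with
    psi_0 = 1 and psi_k = psi_{k-1} * (k - 1 + d) / k.
    """
    rng = np.random.default_rng(seed)
    eps = rng.normal(size=n)
    psi = np.empty(n)
    psi[0] = 1.0
    for k in range(1, n):
        psi[k] = psi[k - 1] * (k - 1 + d) / k
    # Truncated convolution: X_t = sum_{k<=t} psi_k * eps_{t-k}.
    return np.convolve(eps, psi)[:n]

print(simulate_fractional(d=0.8)[:5])
```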

  8. Multifractal analysis of visibility graph-based Ito-related connectivity time series.

    Science.gov (United States)

    Czechowski, Zbigniew; Lovallo, Michele; Telesca, Luciano

    2016-02-01

    In this study, we investigate the multifractal properties of connectivity time series resulting from the visibility graph applied to normally distributed time series generated by Ito equations with multiplicative power-law noise. We show that the multifractality of the connectivity time series (i.e., the series of the numbers of links outgoing from each node) increases with the exponent of the power-law noise. The multifractality of the connectivity time series could be due to the width of the connectivity degree distribution, which can be related to the exit time of the associated Ito time series. Furthermore, the connectivity time series are characterized by persistence, although the original Ito time series are random; this is due to the visibility graph procedure, which, by connecting the values of the time series, generates persistence but destroys most of the nonlinear correlations. Moreover, the visibility graph is sensitive in detecting wide "depressions" in the input time series.
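
    For reference, a naive O(n²) implementation of the natural visibility graph's connectivity (degree) series, the quantity whose multifractality is studied above, is sketched below; it is generic code, not the authors' implementation.

```python
import numpy as np

def visibility_degree_series(y):
    """Connectivity (degree) series of the natural visibility graph.

    Nodes a < b are linked if every intermediate sample c satisfies
    y[c] < y[b] + (y[a] - y[b]) * (b - c) / (b - a).
    """
    y = np.asarray(y, dtype=float)
    n = len(y)
    degree = np.zeros(n, dtype=int)
    for a in range(n):
        for b in range(a + 1, n):
            c = np.arange(a + 1, b)
            if c.size == 0 or np.all(y[c] < y[b] + (y[a] - y[b]) * (b - c) / (b - a)):
                degree[a] += 1
                degree[b] += 1
    return degree

rng = np.random.default_rng(2)
print(visibility_degree_series(rng.normal(size=200))[:10])
```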

  9. Appropriate use of the increment entropy for electrophysiological time series.

    Science.gov (United States)

    Liu, Xiaofeng; Wang, Xue; Zhou, Xu; Jiang, Aimin

    2018-04-01

    The increment entropy (IncrEn) is a new measure for quantifying the complexity of a time series. There are three critical parameters in the IncrEn calculation: N (length of the time series), m (dimensionality), and q (quantifying precision). However, the question of how to choose the most appropriate combination of IncrEn parameters for short datasets has not been extensively explored. The purpose of this research was to provide guidance on choosing suitable IncrEn parameters for short datasets by exploring the effects of varying the parameter values. We used simulated data, epileptic EEG data and cardiac interbeat (RR) data to investigate the effects of the parameters on the calculated IncrEn values. The results reveal that IncrEn is sensitive to changes in m, q and N for short datasets (N≤500). However, IncrEn reaches stability at a data length of N=1000 with m=2 and q=2, and for short datasets (N=100), it shows better relative consistency with 2≤m≤6 and 2≤q≤8. We suggest that the value of N should be no less than 100. To enable a clear distinction between different classes based on IncrEn, we recommend that m and q should take values between 2 and 4. With appropriate parameters, IncrEn enables the effective detection of complexity variations in physiological time series, suggesting that IncrEn should be useful for the analysis of physiological time series in clinical applications. Copyright © 2018 Elsevier Ltd. All rights reserved.

  10. Dollar$ & $en$e. Part IV: Measuring the value of people, structural, and customer capital.

    Science.gov (United States)

    Wilkinson, I

    2001-01-01

    In Part I of this series, I introduced the concept of memes (1). Memes are ideas or concepts, the information world equivalent of genes. The goal of this series of articles is to infect you with my memes, so that you will assimilate, translate, and express them. We discovered that no matter what our area of expertise or "-ology," we all are in the information business. Our goal is to be in the wisdom business. We saw that when we convert raw data into wisdom we are moving along a value chain. Each step in the chain adds a different amount of value to the final product: timely, relevant, accurate, and precise knowledge which can then be applied to create the ultimate product in the value chain: wisdom. In Part II of this series, I infected you with a set of memes for measuring the cost of adding value (2). In Part III of this series, I infected you with a new set of memes for measuring the added value of knowledge, i.e., intellectual capital (3). In Part IV of this series, I will infect you with memes for measuring the value of people, structural, and customer capital.

  11. Complex network approach to fractional time series

    Energy Technology Data Exchange (ETDEWEB)

    Manshour, Pouya [Physics Department, Persian Gulf University, Bushehr 75169 (Iran, Islamic Republic of)

    2015-10-15

    In order to extract correlation information inherited in stochastic time series, the visibility graph algorithm has been recently proposed, by which a time series can be mapped onto a complex network. We demonstrate that the visibility algorithm is not an appropriate one to study the correlation aspects of a time series. We then employ the horizontal visibility algorithm, as a much simpler one, to map fractional processes onto complex networks. The degree distributions are shown to have parabolic exponential forms with Hurst dependent fitting parameter. Further, we take into account other topological properties such as maximum eigenvalue of the adjacency matrix and the degree assortativity, and show that such topological quantities can also be used to predict the Hurst exponent, with an exception for anti-persistent fractional Gaussian noises. To solve this problem, we take into account the Spearman correlation coefficient between nodes' degrees and their corresponding data values in the original time series.

  12. Time series prediction: statistical and neural techniques

    Science.gov (United States)

    Zahirniak, Daniel R.; DeSimio, Martin P.

    1996-03-01

    In this paper we compare the performance of nonlinear neural network techniques to those of linear filtering techniques in the prediction of time series. Specifically, we compare the results of using the nonlinear systems, known as multilayer perceptron and radial basis function neural networks, with the results obtained using the conventional linear Wiener filter, Kalman filter and Widrow-Hoff adaptive filter in predicting future values of stationary and non- stationary time series. Our results indicate the performance of each type of system is heavily dependent upon the form of the time series being predicted and the size of the system used. In particular, the linear filters perform adequately for linear or near linear processes while the nonlinear systems perform better for nonlinear processes. Since the linear systems take much less time to be developed, they should be tried prior to using the nonlinear systems when the linearity properties of the time series process are unknown.
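
    As an illustration of the linear baseline against which the neural networks are compared, a one-step-ahead AR(p) predictor fitted by ordinary least squares (a simple stand-in for the Wiener-type filters mentioned above) can be written as follows; the order p and the test signal are arbitrary.

```python
import numpy as np

def ar_predict(x, p=4):
    """One-step-ahead linear prediction with an AR(p) model fitted by OLS.

    Returns (predictions, targets), both aligned with x[p:].
    """
    x = np.asarray(x, dtype=float)
    # Each row of X contains the p previous values of the series.
    X = np.column_stack([x[p - k - 1: len(x) - k - 1] for k in range(p)])
    y = x[p:]
    design = np.column_stack([np.ones(len(y)), X])
    coef, *_ = np.linalg.lstsq(design, y, rcond=None)
    return design @ coef, y

signal = np.sin(np.arange(300) * 0.1) + 0.05 * np.random.default_rng(3).normal(size=300)
pred, actual = ar_predict(signal)
print("RMSE:", np.sqrt(np.mean((pred - actual) ** 2)))
```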

  13. On the analyticity of Laguerre series

    International Nuclear Information System (INIS)

    Weniger, Ernst Joachim

    2008-01-01

    The transformation of a Laguerre series f(z) = Σ_{n=0}^{∞} λ_n^{(α)} L_n^{(α)}(z) to a power series f(z) = Σ_{n=0}^{∞} γ_n z^n is discussed. Since many nonanalytic functions can be expanded in terms of generalized Laguerre polynomials, success is not guaranteed and such a transformation can easily lead to a mathematically meaningless expansion containing power series coefficients that are infinite in magnitude. Simple sufficient conditions based on the decay rates and sign patterns of the Laguerre series coefficients λ_n^{(α)} as n → ∞ can be formulated which guarantee that the resulting power series represents an analytic function. The transformation produces a mathematically meaningful result if the coefficients λ_n^{(α)} decay either exponentially or factorially as n → ∞. The situation is much more complicated - but also much more interesting - if the λ_n^{(α)} decay only algebraically as n → ∞. If the λ_n^{(α)} ultimately have the same sign, the series expansions for the power series coefficients diverge, and the corresponding function is not analytic at the origin. If the λ_n^{(α)} ultimately have strictly alternating signs, the series expansions for the power series coefficients still diverge, but are summable to something finite, and the resulting power series represents an analytic function. If algebraically decaying and ultimately alternating Laguerre series coefficients λ_n^{(α)} possess sufficiently simple explicit analytical expressions, the summation of the divergent series for the power series coefficients can often be accomplished with the help of analytic continuation formulae for hypergeometric series _{p+1}F_p, but if the λ_n^{(α)} have a complicated structure or if only their numerical values are available, numerical summation techniques have to be employed. It is shown that certain nonlinear sequence transformations - in particular the so-called delta transformation (Weniger 1989 Comput. Phys. Rep. 10 189-371, equation (8.4-4)) - are able to

  14. Time-series prediction and applications a machine intelligence approach

    CERN Document Server

    Konar, Amit

    2017-01-01

    This book presents machine learning and type-2 fuzzy sets for the prediction of time-series with a particular focus on business forecasting applications. It also proposes new uncertainty management techniques in an economic time-series using type-2 fuzzy sets for prediction of the time-series at a given time point from its preceding value in fluctuating business environments. It employs machine learning to determine repetitively occurring similar structural patterns in the time-series and uses a stochastic automaton to predict the most probabilistic structure at a given partition of the time-series. Such predictions help in determining probabilistic moves in a stock index time-series. Primarily written for graduate students and researchers in computer science, the book is equally useful for researchers/professionals in business intelligence and stock index prediction. A background of undergraduate level mathematics is presumed, although not mandatory, for most of the sections. Exercises with tips are provided at...

  15. Refined composite multiscale weighted-permutation entropy of financial time series

    Science.gov (United States)

    Zhang, Yongping; Shang, Pengjian

    2018-04-01

    For quantifying the complexity of nonlinear systems, multiscale weighted-permutation entropy (MWPE) has recently been proposed. MWPE incorporates amplitude information and has been applied to account for the multiple inherent dynamics of time series. However, MWPE may be unreliable, because its estimated values show large fluctuations for slight variations of the data locations, and a significant distinction only for different lengths of time series. Therefore, we propose the refined composite multiscale weighted-permutation entropy (RCMWPE). By comparing the RCMWPE results with those of other methods on both synthetic data and financial time series, the RCMWPE method is shown to retain the advantages inherited from MWPE while also being less sensitive to the data locations, more stable, and much less dependent on the length of the time series. Moreover, we present and discuss the results of the RCMWPE method on the daily price return series from Asian and European stock markets. There are significant differences between Asian markets and European markets, and the entropy values of the Hang Seng Index (HSI) are close to but higher than those of the European markets. The reliability of the proposed RCMWPE method has been supported by simulations on generated and real data. It could be applied to a variety of fields to quantify the complexity of the systems over multiple scales more accurately.
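
    The 'refined composite' idea rests on using all shifted coarse-grained versions of the series at each scale; a sketch of that coarse-graining step is shown below (the weighted-permutation entropy itself, and the averaging of pattern probabilities across the shifted series, are not reproduced).

```python
import numpy as np

def coarse_grain_all_offsets(x, scale):
    """All `scale` coarse-grained versions of x used by 'refined composite'
    multiscale entropy: the k-th series averages non-overlapping windows of
    length `scale` starting at offset k (k = 0, ..., scale-1)."""
    x = np.asarray(x, dtype=float)
    out = []
    for k in range(scale):
        seg = x[k:]
        m = len(seg) // scale
        out.append(seg[:m * scale].reshape(m, scale).mean(axis=1))
    return out

series = coarse_grain_all_offsets(np.random.default_rng(4).normal(size=1000), scale=3)
print([len(s) for s in series])
```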

  16. Fourier series

    CERN Document Server

    Tolstov, Georgi P

    1962-01-01

    Richard A. Silverman's series of translations of outstanding Russian textbooks and monographs is well-known to people in the fields of mathematics, physics, and engineering. The present book is another excellent text from this series, a valuable addition to the English-language literature on Fourier series.This edition is organized into nine well-defined chapters: Trigonometric Fourier Series, Orthogonal Systems, Convergence of Trigonometric Fourier Series, Trigonometric Series with Decreasing Coefficients, Operations on Fourier Series, Summation of Trigonometric Fourier Series, Double Fourie

  17. Usefulness of Reformatted CT Rib Series in Patients with Thoracic Trauma

    Energy Technology Data Exchange (ETDEWEB)

    Moon, Sung Nam; Park, Seong Hoon; Kim, Na Hyung; Juhng, Seon Kwan; Yoon, Kwon Ha [Dept. of Radiology and Institute for Radiological Imaging Science, Wonkwang University School of Medicine, Iksan (Korea, Republic of); Bang, Dong Ho [Dept. of Radiology, Aerospace Medical Center, Cheongwon (Korea, Republic of)

    2013-01-15

    To assess the value of adding a reformatted computed tomography (CT) rib series to transversely reconstructed CT imaging in the evaluation of rib fractures in patients with suspected traumatic thoracic injuries. One hundred consecutive patients with suspected traumatic thoracic injuries underwent 128-section multi-detector row CT. Transverse CT images with 5-mm-thick sections were reconstructed and rib series were reformatted using isotropic voxel data. Three independent radiologists, who were blinded to the data, interpreted the CT scans at 2 sessions with a 4-week interval between the sessions. Only transverse CT images were reviewed at the first session. At the second session, the CT images were reviewed along with the reformatted CT rib series. The following parameters were analyzed: receiver operating characteristic (ROC) curve, pairwise comparisons of ROC curves, sensitivity, specificity, positive predictive value, and negative predictive value. There were 153 rib fractures in 29 patients. The area under the ROC curve (Az) improved for all observers. The diagnostic sensitivity and specificity of each observer tended to improve in the second session. The mean confidence scores for all observers of patients with rib fractures improved significantly in the second session. A reformatted CT rib series together with transverse CT scans is useful for the evaluation of rib fractures.

  18. Complexity analysis of the turbulent environmental fluid flow time series

    Science.gov (United States)

    Mihailović, D. T.; Nikolić-Đorić, E.; Drešković, N.; Mimić, G.

    2014-02-01

    We have used the Kolmogorov complexities, sample and permutation entropies to quantify the randomness degree in river flow time series of two mountain rivers in Bosnia and Herzegovina, representing the turbulent environmental fluid, for the period 1926-1990. In particular, we have examined the monthly river flow time series from two rivers (the Miljacka and the Bosnia) in the mountain part of their flow and then calculated the Kolmogorov complexity (KL) based on the Lempel-Ziv Algorithm (LZA) (lower-KLL and upper-KLU), sample entropy (SE) and permutation entropy (PE) values for each time series. The results indicate that the KLL, KLU, SE and PE values in two rivers are close to each other regardless of the amplitude differences in their monthly flow rates. We have illustrated the changes in mountain river flow complexity by experiments using (i) the data set for the Bosnia River and (ii) anticipated human activities and projected climate changes. We have explored the sensitivity of considered measures in dependence on the length of time series. In addition, we have divided the period 1926-1990 into three subintervals: (a) 1926-1945, (b) 1946-1965, (c) 1966-1990, and calculated the KLL, KLU, SE, PE values for the various time series in these subintervals. It is found that during the period 1946-1965, there is a decrease in their complexities, and corresponding changes in the SE and PE, in comparison to the period 1926-1990. This complexity loss may be primarily attributed to (i) human interventions, after the Second World War, on these two rivers because of their use for water consumption and (ii) climate change in recent times.

  19. Computer simulation of the natural U 238 and U 235 radioactive series decay

    International Nuclear Information System (INIS)

    Barna, A.; Oncescu, M.

    1980-01-01

    The principles of the computer simulation of radionuclide decay - the adoption and codification of its decay scheme - and the principle of adopting a radionuclide chain within a series are applied to the computer simulation of the decay of the natural U 238 and U 235 series radionuclides. Using the computer simulation data of the adopted chains of these two series, the characteristic decay quantities of the series radionuclides, the gamma spectra and the basic characteristics of each of these series are determined and compared with the experimental values given in the literature. (author)

  20. 21 CFR 352.73 - Determination of SPF value.

    Science.gov (United States)

    2010-04-01

    § 352.73 Determination of SPF value (21 CFR, Food and Drugs, 2010-04-01 edition). (a)(1) The following erythema action spectrum shall be used... used in calculating the SPF. (c) Determination of individual SPF values. A series of UV radiation...

  1. Rethinking Value in the Bio-economy: Finance, Assetization, and the Management of Value.

    Science.gov (United States)

    Birch, Kean

    2017-05-01

    Current debates in science and technology studies emphasize that the bio-economy - or the articulation of capitalism and biotechnology - is built on notions of commodity production, commodification, and materiality, emphasizing that it is possible to derive value from body parts, molecular and cellular tissues, biological processes, and so on. What is missing from these perspectives, however, is consideration of the political-economic actors, knowledges, and practices involved in the creation and management of value. As part of a rethinking of value in the bio-economy, this article analyzes three key political-economic processes: financialization, capitalization, and assetization. In doing so, it argues that value is managed as part of a series of valuation practices; it is not inherent in biological materialities.

  2. `Indoor` series vending machines; `Indoor` series jido hanbaiki

    Energy Technology Data Exchange (ETDEWEB)

    Gensui, T.; Kida, A. [Fuji Electric Co. Ltd., Tokyo (Japan); Okumura, H. [Fuji Denki Reiki Co. Ltd., Tokyo (Japan)

    1996-07-10

    This paper introduces three series of vending machines that were designed to match the interior of an office building. The three series are vending machines for cups, paper packs, cans, and tobacco. Among the three series, `Interior` series has a symmetric design that was coated in a grain pattern. The inside of the `Interior` series is coated by laser satin to ensure a sense of superior quality and a refined style. The push-button used for product selection is hot-stamped on the plastic surface to ensure the hair-line luster. `Interior Phase II` series has a bay window design with a sense of superior quality and lightness. The inside of the `Interior Phase II` series is coated by laser satin. `Interior 21` series is integrated with the wall except the sales operation panel. The upper and lower dress panels can be detached and attached. The door lock is a wire-type structure with high operativity. The operation block is coated by titanium color. The dimensions of three series are standardized. 6 figs., 1 tab.

  3. Springer handbook of nanotechnology

    CERN Document Server

    2017-01-01

    This comprehensive handbook has become the definitive reference work in the field of nanoscience and nanotechnology, and this 4th edition incorporates a number of recent new developments. It integrates nanofabrication, nanomaterials, nanodevices, nanomechanics, nanotribology, materials science, and reliability engineering knowledge in just one volume. Furthermore, it discusses various nanostructures; micro/nanofabrication; micro/nanodevices and biomicro/nanodevices, as well as scanning probe microscopy; nanotribology and nanomechanics; molecularly thick films; industrial applications and nanodevice reliability; societal, environmental, health and safety issues; and nanotechnology education. In this new edition, written by an international team of over 140 distinguished experts and put together by an experienced editor with a comprehensive understanding of the field, almost all the chapters are either new or substantially revised and expanded, with new topics of interest added. It is an essential resource for ...

  4. Predicting long-term catchment nutrient export: the use of nonlinear time series models

    Science.gov (United States)

    Valent, Peter; Howden, Nicholas J. K.; Szolgay, Jan; Komornikova, Magda

    2010-05-01

    After the Second World War the nitrate concentrations in European water bodies changed significantly as a result of increased nitrogen fertilizer use and changes in land use. However, in recent decades, as a consequence of the implementation of nitrate-reducing measures in Europe, the nitrate concentrations in water bodies have slowly decreased. As a result, the mean and variance of the observed time series also change with time (nonstationarity and heteroscedasticity). In order to detect changes and properly describe the behaviour of such time series by time series analysis, linear models (such as autoregressive (AR), moving average (MA) and autoregressive moving average (ARMA) models) are no longer suitable. Time series with sudden changes in statistical characteristics can cause various problems in the calibration of traditional water quality models and thus give biased predictions. Proper statistical analysis of these non-stationary and heteroscedastic time series, with the aim of detecting and subsequently explaining the variations in their statistical characteristics, requires the use of nonlinear time series models. This information can then be used to improve the model building and calibration of conceptual water quality models, or to select the right calibration periods in order to produce reliable predictions. The objective of this contribution is to analyze two long time series of nitrate concentrations of the rivers Ouse and Stour with advanced nonlinear statistical modelling techniques and compare their performance with traditional linear models of the ARMA class in order to identify changes in the time series characteristics. The time series were analysed with nonlinear models with multiple regimes represented by self-exciting threshold autoregressive (SETAR) and Markov-switching (MSW) models. The analysis showed that, based on the value of residual sum of squares (RSS) in both datasets, SETAR and MSW models described the time-series better than models of the
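
    To make the SETAR idea concrete, a minimal two-regime SETAR fit by grid search over the threshold is sketched below; the delay, AR order, grid and synthetic data are assumptions for illustration, not the models fitted to the Ouse and Stour records.

```python
import numpy as np

def fit_setar(x, delay=1, order=1, n_grid=50):
    """Fit a two-regime SETAR model by grid search over the threshold r.

    Regime membership of x_t is decided by x_{t-delay} <= r; each regime is an
    AR(order) model fitted by OLS, and r minimises the total residual sum of
    squares (RSS)."""
    x = np.asarray(x, dtype=float)
    p = max(order, delay)
    y = x[p:]
    lags = np.column_stack([x[p - k: len(x) - k] for k in range(1, order + 1)])
    z = x[p - delay: len(x) - delay]            # threshold variable x_{t-delay}
    X = np.column_stack([np.ones(len(y)), lags])

    def rss(mask):
        if mask.sum() < X.shape[1] or (~mask).sum() < X.shape[1]:
            return np.inf
        total = 0.0
        for m in (mask, ~mask):
            coef, *_ = np.linalg.lstsq(X[m], y[m], rcond=None)
            total += np.sum((y[m] - X[m] @ coef) ** 2)
        return total

    candidates = np.quantile(z, np.linspace(0.15, 0.85, n_grid))
    scores = [rss(z <= r) for r in candidates]
    return candidates[int(np.argmin(scores))], min(scores)

# Synthetic two-regime series for illustration.
rng = np.random.default_rng(5)
x = np.zeros(500)
for t in range(1, 500):
    phi = 0.8 if x[t - 1] <= 0 else -0.5
    x[t] = phi * x[t - 1] + rng.normal()
print(fit_setar(x))
```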

  5. Integrating Social Justice into the Practice of CBFT: A Critical Look at Family Schemas.

    Science.gov (United States)

    Parker, Elizabeth O; McDowell, Teresa

    2017-07-01

    Many families come to therapy struggling with the negative consequence of social inequity. Family therapy modalities have been developed to address these negative consequences and attend to power and social equity (Transformative family therapy: Just families in a just society. Boston, MA: Pearson Education; Socio-emotional relationship therapy. New York, NY: Springer). We argue that many family therapy modalities can be adapted to include social equity (Applying critical social theory in family therapy practice. AFTA Springer Series. New York, NY: Springer Publishing). Specifically, cognitive behavioral family therapy can be used to address the inequality in social systems that negatively affect the family system. We focus on schema formation and suggest an emphasis on societal schemas within the therapy milieu as a tool to help families see how societal inequality can affect the problems faced in family life. © 2016 American Association for Marriage and Family Therapy.

  6. Neural network versus classical time series forecasting models

    Science.gov (United States)

    Nor, Maria Elena; Safuan, Hamizah Mohd; Shab, Noorzehan Fazahiyah Md; Asrul, Mohd; Abdullah, Affendi; Mohamad, Nurul Asmaa Izzati; Lee, Muhammad Hisyam

    2017-05-01

    Artificial neural networks (ANN) have an advantage in time series forecasting as they have the potential to solve complex forecasting problems. This is because an ANN is a data-driven approach that can be trained to map past values of a time series. In this study the forecast performance of a neural network and a classical time series forecasting method, namely seasonal autoregressive integrated moving average (SARIMA) models, was compared using gold price data. Moreover, the effect of different data preprocessing on the forecast performance of the neural network was examined. The forecast accuracy was evaluated using the mean absolute deviation, root mean square error and mean absolute percentage error. It was found that the ANN produced the most accurate forecast when the Box-Cox transformation was used as data preprocessing.
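
    The three accuracy measures named above are straightforward to compute; a small helper is sketched below (the MAPE line assumes there are no zero actual values).

```python
import numpy as np

def forecast_errors(actual, forecast):
    """Mean absolute deviation (MAD), root mean square error (RMSE) and
    mean absolute percentage error (MAPE) for comparing forecasts."""
    a = np.asarray(actual, dtype=float)
    f = np.asarray(forecast, dtype=float)
    e = a - f
    return {
        "MAD": np.mean(np.abs(e)),
        "RMSE": np.sqrt(np.mean(e ** 2)),
        "MAPE": 100.0 * np.mean(np.abs(e / a)),   # assumes no zero actuals
    }

print(forecast_errors([100, 102, 105], [101, 101, 106]))
```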

  7. Using historic earnings to value hydro energy

    International Nuclear Information System (INIS)

    Robson, I.A.; Whittington, H.W.

    1993-01-01

    This article briefly presents a technique for assigning a value to the water held in and removed from the hydro reservoir. Using historic earnings as the basis for a series of equations, it aims to give engineers trading energy a reliable means of placing a value on what is effectively a ''free'' resource. (Author)

  8. Model-based Clustering of Categorical Time Series with Multinomial Logit Classification

    Science.gov (United States)

    Frühwirth-Schnatter, Sylvia; Pamminger, Christoph; Winter-Ebmer, Rudolf; Weber, Andrea

    2010-09-01

    A common problem in many areas of applied statistics is to identify groups of similar time series in a panel of time series. However, distance-based clustering methods cannot easily be extended to time series data, where an appropriate distance measure is rather difficult to define, particularly for discrete-valued time series. Markov chain clustering, proposed by Pamminger and Frühwirth-Schnatter [6], is an approach for clustering discrete-valued time series obtained by observing a categorical variable with several states. This model-based clustering method is based on finite mixtures of first-order time-homogeneous Markov chain models. In order to further explain group membership we present an extension to the approach of Pamminger and Frühwirth-Schnatter [6] by formulating a probabilistic model for the latent group indicators within the Bayesian classification rule by using a multinomial logit model. The parameters are estimated for a fixed number of clusters within a Bayesian framework using a Markov chain Monte Carlo (MCMC) sampling scheme representing a (full) Gibbs-type sampler which involves only draws from standard distributions. Finally, an application to a panel of Austrian wage mobility data is presented which leads to an interesting segmentation of the Austrian labour market.
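
    The building block of Markov chain clustering is the first-order transition matrix estimated from a single categorical series; a minimal sketch is given below (the finite-mixture and multinomial-logit machinery of the paper is not reproduced).

```python
import numpy as np

def transition_matrix(seq, n_states):
    """Estimate a first-order Markov chain transition matrix from one
    categorical time series with states coded 0..n_states-1."""
    counts = np.zeros((n_states, n_states))
    for s, t in zip(seq[:-1], seq[1:]):
        counts[s, t] += 1
    rows = counts.sum(axis=1, keepdims=True)
    rows[rows == 0] = 1.0                      # avoid division by zero
    return counts / rows

print(transition_matrix([0, 1, 1, 2, 0, 1, 2, 2, 1], n_states=3))
```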

  9. Participatory public health systems research: value of community involvement in a study series in mental health emergency preparedness.

    Science.gov (United States)

    McCabe, O Lee; Marum, Felicity; Semon, Natalie; Mosley, Adrian; Gwon, Howard; Perry, Charlene; Moore, Suzanne Straub; Links, Jonathan M

    2012-01-01

    Concerns have arisen over recent years about the absence of empirically derived evidence on which to base policy and practice in the public health system, in general, and to meet the challenge of public health emergency preparedness, in particular. Related issues include the challenge of disaster-caused, behavioral health surge, and the frequent exclusion of populations from studies that the research is meant to aid. To characterize the contributions of nonacademic collaborators to a series of projects validating a set of interventions to enhance capacity and competency of public mental health preparedness planning and response. Urban, suburban, and rural communities of the state of Maryland and rural communities of the state of Iowa. Study partners and participants (both of this project and the studies examined) were representatives of academic health centers (AHCs), local health departments (LHDs), and faith-based organizations (FBOs) and their communities. A multiple-project, case study analysis was conducted, that is, four research projects implemented by the authors from 2005 through 2011 to determine the types and impact of contributions made by nonacademic collaborators to those projects. The analysis involved reviewing research records, conceptualizing contributions (and providing examples) for government, faith, and (nonacademic) institutional collaborators. Ten areas were identified where partners made valuable contributions to the study series; these "value-areas" were as follows: 1) leadership and management of the projects; 2) formulation and refinement of research topics, aims, etc; 3) recruitment and retention of participants; 4) design and enhancement of interventions; 5) delivery of interventions; 6) collection, analysis, and interpretation of data; 7) dissemination of findings; 8) ensuring sustainability of faith/government preparedness planning relationships; 9) optimizing scalability and portability of the model; and 10) facilitating

  10. Las series televisivas juveniles: tramas y conflictos en una «teen series» Television Fiction Series Targeted at Young Audience: Plots and Conflicts Portrayed in a Teen Series

    Directory of Open Access Journals (Sweden)

    Núria García Muñoz

    2011-10-01

    Full Text Available This paper presents the main findings of a research project on teen series, i.e. television fiction series featuring teenage protagonists and specifically targeted at a young audience. The analysis of the portrayal of young people in fiction specifically aimed at a young audience is highly meaningful both for fiction production and for audience reception, since the potential consumers are at a key moment in the process of constructing their identities. After reviewing the main precedents in the study of the representation of young people in television fiction, the conceptual framework of teen series is described and its relationship with youth media consumption is discussed. A case study is then presented, consisting of a content analysis of the North American series «Dawson's Creek», carried out on a representative sample of three seasons of the series, in order to analyze two groups of variables: variables relating to the characters, and variables relating to the plots and conflicts. The results for the second group of variables are discussed, with particular attention to the characteristics of the plots and the role of the characters in their development and resolution. Acceptance of personal identity, love and friendship proved to be the most recurrent themes. In addition, social relationships between the characters were found to play a fundamental role in the development of the plots and conflicts.

  11. RADON CONCENTRATION TIME SERIES MODELING AND APPLICATION DISCUSSION.

    Science.gov (United States)

    Stránský, V; Thinová, L

    2017-11-01

    In 2010, continual radon measurement was established at the Mladeč Caves in the Czech Republic using a continual radon monitor, RADIM3A. In order to model the radon time series in the years 2010-15, the Box-Jenkins methodology, often used in econometrics, was applied. Because of the behavior of the radon concentrations (RCs), a seasonal autoregressive integrated moving average model with exogenous variables (SARIMAX) was chosen to model the measured time series. This model uses the seasonality of the time series, previously acquired values and delayed atmospheric parameters to forecast RC. The developed model for the RC time series is called regARIMA(5,1,3). Model residuals could be retrospectively compared with seismic evidence of local or global earthquakes that occurred during the RC measurements. This technique enables us to assess whether continuously measured RC could serve as an earthquake precursor. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
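
    A SARIMAX model of this general kind can be fitted with the statsmodels package; the sketch below uses synthetic stand-in data, the non-seasonal order simply mirrors the regARIMA(5,1,3) label, and the seasonal component and single exogenous regressor are assumptions, not the paper's actual fit.

```python
import numpy as np
from statsmodels.tsa.statespace.sarimax import SARIMAX

# Synthetic hourly stand-ins for the measured radon and atmospheric series.
rng = np.random.default_rng(6)
n = 600
temp = 10 + 5 * np.sin(2 * np.pi * np.arange(n) / 24) + rng.normal(0, 0.5, n)
radon = 800 - 20 * np.roll(temp, 3) + rng.normal(0, 30, n)

# ARIMA(5,1,3) with a daily seasonal term and one exogenous regressor.
model = SARIMAX(radon, exog=temp, order=(5, 1, 3), seasonal_order=(1, 0, 1, 24))
res = model.fit(disp=False)

future_temp = temp[-24:]   # placeholder exogenous values for the forecast horizon
print(res.forecast(steps=24, exog=future_temp))
```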

  12. Comparison of annual maximum series and partial duration series methods for modeling extreme hydrologic events

    DEFF Research Database (Denmark)

    Madsen, Henrik; Rasmussen, Peter F.; Rosbjerg, Dan

    1997-01-01

    Two different models for analyzing extreme hydrologic events, based on, respectively, partial duration series (PDS) and annual maximum series (AMS), are compared. The PDS model assumes a generalized Pareto distribution for modeling threshold exceedances, corresponding to a generalized extreme value (GEV) distribution for annual maxima. In the case of ML estimation, the PDS model provides the most efficient T-year event estimator. In the cases of MOM and PWM estimation, the PDS model is generally preferable for negative shape parameters, whereas the AMS model yields the most efficient estimator for positive shape parameters. A comparison of the considered methods reveals that in general, one should use the PDS model with MOM estimation for negative shape parameters, the PDS model with exponentially distributed exceedances if the shape parameter is close to zero, the AMS model with MOM estimation for moderately positive shape parameters, and the PDS...

  13. On Stabilizing the Variance of Dynamic Functional Brain Connectivity Time Series.

    Science.gov (United States)

    Thompson, William Hedley; Fransson, Peter

    2016-12-01

    Assessment of dynamic functional brain connectivity based on functional magnetic resonance imaging (fMRI) data is an increasingly popular strategy to investigate temporal dynamics of the brain's large-scale network architecture. Current practice when deriving connectivity estimates over time is to use the Fisher transformation, which aims to stabilize the variance of correlation values that fluctuate around varying true correlation values. It is, however, unclear how well the stabilization of signal variance performed by the Fisher transformation works for each connectivity time series, when the true correlation is assumed to be fluctuating. This is of importance because many subsequent analyses either assume or perform better when the time series have stable variance or adheres to an approximate Gaussian distribution. In this article, using simulations and analysis of resting-state fMRI data, we analyze the effect of applying different variance stabilization strategies on connectivity time series. We focus our investigation on the Fisher transformation, the Box-Cox (BC) transformation and an approach that combines both transformations. Our results show that, if the intention of stabilizing the variance is to use metrics on the time series, where stable variance or a Gaussian distribution is desired (e.g., clustering), the Fisher transformation is not optimal and may even skew connectivity time series away from being Gaussian. Furthermore, we show that the suboptimal performance of the Fisher transformation can be substantially improved by including an additional BC transformation after the dynamic functional connectivity time series has been Fisher transformed.
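
    A minimal sketch of the combined strategy discussed here (sliding-window correlation, Fisher z-transformation, then an additional Box-Cox step) is given below; the window length, the positivity shift and the toy signals are assumptions.

```python
import numpy as np
from scipy import stats

def sliding_window_connectivity(x, y, win=60):
    """Sliding-window correlation time series between two signals,
    followed by the Fisher transformation and a Box-Cox step."""
    r = np.array([np.corrcoef(x[i:i + win], y[i:i + win])[0, 1]
                  for i in range(len(x) - win + 1)])
    z = np.arctanh(r)                          # Fisher transformation
    shifted = z - z.min() + 1e-6               # Box-Cox requires positive input
    z_bc, lam = stats.boxcox(shifted)          # combined Fisher + Box-Cox
    return r, z, z_bc, lam

rng = np.random.default_rng(7)
x, y = rng.normal(size=1000), rng.normal(size=1000)
r, z, z_bc, lam = sliding_window_connectivity(x, y)
print("Box-Cox lambda:", lam)
```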

  14. Regularization and asymptotic expansion of certain distributions defined by divergent series

    Directory of Open Access Journals (Sweden)

    Ricardo Estrada

    1995-01-01

    Full Text Available The regularization of the distribution ∑_{n=−∞}^{∞} δ(x − pn), which gives a regularized value to the divergent series ∑_{n=−∞}^{∞} φ(pn), is obtained in several spaces of test functions. The asymptotic expansion as ε → 0+ of series of the type ∑_{n=0}^{∞} φ(ε pn) is also obtained.

  15. Adjusted monthly temperature and precipitation values for Guinea Conakry (1941-2010) using HOMER.

    Science.gov (United States)

    Aguilar, Enric; Aziz Barry, Abdoul; Mestre, Olivier

    2013-04-01

    Africa is a data-sparse region and there are very few studies presenting homogenized monthly records. In this work, we introduce a dataset consisting of 12 stations spread over Guinea Conakry containing daily values of maximum and minimum temperature and accumulated rainfall for the period 1941-2010. The daily values have been quality controlled using RClimDex routines, plus other interactive quality control applications coded by the authors. After applying the different tests, more than 200 daily values were flagged as doubtful and carefully checked against the statistical distribution of the series and the rest of the dataset. Finally, 40 values were modified or set to missing and the rest were validated. The quality controlled daily dataset was used to produce monthly means and homogenized with HOMER, a new R package which includes the relative methods that performed best in the experiments conducted in the framework of the COST-HOME action. A total of 38 inhomogeneities were found for temperature. As a total of 788 years of data were analyzed, the average ratio was one break every 20.7 years. The station with the largest number of inhomogeneities was Conakry (5 breaks) and one station, Kissidougou, was identified as homogeneous. The average number of breaks per station was 3.2. The mean value of the monthly factors applied to maximum (minimum) temperature was 0.17 °C (-1.08 °C). For precipitation, due to the need for a denser network to correctly homogenize this variable, only two major inhomogeneities in Conakry (1941-1961, -12%) and Kindia (1941-1976, -10%) were corrected. The adjusted dataset was used to compute regional series for the three variables and trends for the 1941-2010 period. The regional mean has been computed by simply averaging the anomalies (relative to 1971-2000) of the 12 time series. Two different versions have been obtained: a first one (A) makes use of the missing values interpolation made by HOMER (so all annual values in the regional series

  16. Values and Opportunities in Social Entrepreneurship

    OpenAIRE

    Hockerts, Kai; Mair, Johanna (Professor); Robinson, Jeffrey

    2016-01-01

    Over the past few years social entrepreneurship has grown as a research field. In this 3rd volume in the series, contributions explore questions of values in social entrepreneurship as well as the identification and exploitation of social venturing opportunities.

  17. On Sums of Numerical Series and Fourier Series

    Science.gov (United States)

    Pavao, H. Germano; de Oliveira, E. Capelas

    2008-01-01

    We discuss a class of trigonometric functions whose corresponding Fourier series, on a conveniently chosen interval, can be used to calculate several numerical series. Particular cases are presented and two recent results involving numerical series are recovered. (Contains 1 note.)

  18. A comment on measuring the Hurst exponent of financial time series

    Science.gov (United States)

    Couillard, Michel; Davison, Matt

    2005-03-01

    A fundamental hypothesis of quantitative finance is that stock price variations are independent and can be modeled using Brownian motion. In recent years, it was proposed to use rescaled range analysis and its characteristic value, the Hurst exponent, to test for independence in financial time series. Theoretically, independent time series should be characterized by a Hurst exponent of 1/2. However, finite Brownian motion data sets will always give a value of the Hurst exponent larger than 1/2 and without an appropriate statistical test such a value can mistakenly be interpreted as evidence of long term memory. We obtain a more precise statistical significance test for the Hurst exponent and apply it to real financial data sets. Our empirical analysis shows no long-term memory in some financial returns, suggesting that Brownian motion cannot be rejected as a model for price dynamics.
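
    A basic rescaled range (R/S) estimate of the Hurst exponent, the quantity whose finite-sample bias is discussed above, can be sketched as follows; the window sizes and test signal are arbitrary, and the statistical significance test proposed in the paper is not reproduced.

```python
import numpy as np

def hurst_rs(x, min_chunk=8):
    """Estimate the Hurst exponent by rescaled range (R/S) analysis:
    the slope of log(R/S) versus log(n) over a range of window sizes n."""
    x = np.asarray(x, dtype=float)
    N = len(x)
    sizes = np.unique(np.floor(np.logspace(np.log10(min_chunk),
                                           np.log10(N // 2), 12)).astype(int))
    log_n, log_rs = [], []
    for n in sizes:
        rs_vals = []
        for start in range(0, N - n + 1, n):
            seg = x[start:start + n]
            dev = np.cumsum(seg - seg.mean())
            R = dev.max() - dev.min()          # range of cumulative deviations
            S = seg.std(ddof=1)                # standard deviation of the window
            if S > 0:
                rs_vals.append(R / S)
        if rs_vals:
            log_n.append(np.log(n))
            log_rs.append(np.log(np.mean(rs_vals)))
    slope, _ = np.polyfit(log_n, log_rs, 1)
    return slope

rng = np.random.default_rng(8)
print(hurst_rs(rng.normal(size=4096)))   # finite samples tend to give values above 0.5
```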

  19. Comparative study of series of solar radiation; Estudio comparativo de series de radiacion solar

    Energy Technology Data Exchange (ETDEWEB)

    Adaro, Agustin; Cesari, Daniela; Lema, Alba; Galimberti, Pablo; Barral, Jorge [Universidad Nacional de Rio Cuarto, (Argentina)

    2000-07-01

    In any solar device or system intended to exploit solar energy, the most appropriate information on radiation levels is required. Because this energy source depends on atmospheric and meteorological fluctuations, the best possible information on the amount and variability of the available solar energy is needed. One approach is the statistical treatment of the available data, both solar radiation and hours of sunshine. This approach is attractive because far more information on sunshine hours exists: sunshine duration is measured with instruments called heliographs, whose complexity and cost are much lower than those of radiation instruments. Among heliographs, the Campbell-Stokes type is the most widely used and is the one installed in most meteorological stations in Argentina and worldwide, so sunshine-hour records are the most abundant. The objective of the present work is to find a relationship between the measured series of sunshine hours and irradiation. The study is carried out using time series models and the Angström-Page model, and the generation of radiation sequences using the concept of Markov chains is also examined. The series for Río Cuarto are analysed to determine the transfer function between the two series, and global solar radiation values are obtained for towns in the same region. The coefficients of the Angström-Page equation are obtained for Río Cuarto. Monthly mean values from these two methods are obtained and the results are compared.
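
    The Angström-Page relation mentioned above is a linear regression of the clearness index on relative sunshine duration; a minimal sketch (with made-up monthly values, not the Río Cuarto data) is given below.

```python
import numpy as np

def angstrom_page_fit(n_over_N, H_over_H0):
    """Fit the Angström-Page relation H/H0 = a + b * (n/N) by least squares,
    where n/N is the relative sunshine duration (Campbell-Stokes heliograph)
    and H/H0 the clearness index (global over extraterrestrial irradiation)."""
    s = np.asarray(n_over_N, dtype=float)
    k = np.asarray(H_over_H0, dtype=float)
    b, a = np.polyfit(s, k, 1)     # polyfit returns slope first, intercept second
    return a, b

# Illustrative monthly values (made up for demonstration).
s = np.array([0.55, 0.60, 0.62, 0.58, 0.50, 0.45, 0.48, 0.52, 0.57, 0.61, 0.63, 0.59])
k = np.array([0.55, 0.58, 0.60, 0.57, 0.52, 0.48, 0.50, 0.53, 0.56, 0.59, 0.61, 0.58])
print(angstrom_page_fit(s, k))
```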

  20. Pre-Proceedings of the 1st International Workshop on Process-oriented Information Systems in Healthcare (ProHealth'07)

    NARCIS (Netherlands)

    Reichert, M.U.; Peleg, M.; Lenz, R.

    These pre-proceedings contain the presentations given at the 1st Int'l Workshop on Process-oriented Information Systems in Healthcare (ProHealth'07). Formal proceedings will be published in Springer's LNCS series. Process-oriented information systems have been demanded for more than 20 years and

  1. Modelling financial high frequency data using point processes

    DEFF Research Database (Denmark)

    Hautsch, Nikolaus; Bauwens, Luc

    In this chapter written for a forthcoming Handbook of Financial Time Series to be published by Springer-Verlag, we review the econometric literature on dynamic duration and intensity processes applied to high frequency financial data, which was boosted by the work of Engle and Russell (1997...

  2. Infinite series

    CERN Document Server

    Hirschman, Isidore Isaac

    2014-01-01

    This text for advanced undergraduate and graduate students presents a rigorous approach that also emphasizes applications. Encompassing more than the usual amount of material on the problems of computation with series, the treatment offers many applications, including those related to the theory of special functions. Numerous problems appear throughout the book.The first chapter introduces the elementary theory of infinite series, followed by a relatively complete exposition of the basic properties of Taylor series and Fourier series. Additional subjects include series of functions and the app

  3. THE IMPLEMENTATION OF STRATEGIC MANAGEMENT ACCOUNTING BASED ON VALUE CHAIN ANALYSIS: VALUE CHAIN ACCOUNTING

    Directory of Open Access Journals (Sweden)

    Mustafa KIRLI

    2011-01-01

    Full Text Available To compete successfully in today's highly competitive global environment, companies have made customer satisfaction an overriding priority. They have also adopted new management approaches, changed their manufacturing systems and invested in new technologies. Strategic management accounting examines the decision-making linked with business operations and the strategic work of financial administration in support of the same. Strategic management accounting is a theory and practice of accounting that looks at an organization's cost position, cost advantages and product differentiation in order to make market decisions. The value chain is a systematic approach to examining the development of competitive advantage. The chain consists of a series of activities that create and build value. Value chain analysis refers to a structured method of analyzing the effects of all core activities on cost and/or differentiation of the value chain. With the growing division of labour and the global dispersion of the production of components, systemic competitiveness and thus value chain analysis have become increasingly important. Value chain accounting is the combination of value chain analysis and accounting theory. Value chain accounting is an important part of value chain management and a further development of strategic management accounting. Value chain accounting is a new approach to the accounting discipline which combines the theories of value chain management, supply chain management, accounting management and information technology. From the analysis of value chain theory and strategic management accounting theory, this paper proposes an accounting management framework based on value chain analysis, called value chain accounting.

  4. Adaptive time-variant models for fuzzy-time-series forecasting.

    Science.gov (United States)

    Wong, Wai-Keung; Bai, Enjian; Chu, Alice Wai-Ching

    2010-12-01

    A fuzzy time series has been applied to the prediction of enrollment, temperature, stock indices, and other domains. Related studies mainly focus on three factors, namely, the partition of discourse, the content of forecasting rules, and the methods of defuzzification, all of which greatly influence the prediction accuracy of forecasting models. These studies use fixed analysis window sizes for forecasting. In this paper, an adaptive time-variant fuzzy-time-series forecasting model (ATVF) is proposed to improve forecasting accuracy. The proposed model automatically adapts the analysis window size of fuzzy time series based on the prediction accuracy in the training phase and uses heuristic rules to generate forecasting values in the testing phase. The performance of the ATVF model is tested using both simulated and actual time series including the enrollments at the University of Alabama, Tuscaloosa, and the Taiwan Stock Exchange Capitalization Weighted Stock Index (TAIEX). The experiment results show that the proposed ATVF model achieves a significant improvement in forecasting accuracy as compared to other fuzzy-time-series forecasting models.
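
    The adaptive-window idea in this record can be illustrated with a deliberately simplified sketch: a naive moving-average forecaster whose history-window length is chosen by one-step-ahead training RMSE. This is not the authors' fuzzy-time-series machinery; the forecaster, the candidate window range and the synthetic series are assumptions for illustration only.

```python
import numpy as np

def naive_window_forecast(history, window):
    """Forecast the next value as the mean of the last `window` observations."""
    return np.mean(history[-window:])

def select_window_size(train, candidate_windows):
    """Pick the window length that minimises one-step-ahead RMSE on the training data."""
    best_w, best_rmse = None, np.inf
    for w in candidate_windows:
        errors = []
        for t in range(w, len(train)):
            errors.append(train[t] - naive_window_forecast(train[:t], w))
        rmse = np.sqrt(np.mean(np.square(errors)))
        if rmse < best_rmse:
            best_w, best_rmse = w, rmse
    return best_w, best_rmse

# toy usage with a synthetic series
rng = np.random.default_rng(0)
series = np.cumsum(rng.normal(size=200))
w, rmse = select_window_size(series[:150], candidate_windows=range(2, 20))
print("selected window:", w, "training RMSE:", round(rmse, 3))
```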

  5. Hidden costs, value lost: uninsurance in America

    National Research Council Canada - National Science Library

    Committee on the Consequences of Uninsurance

    2003-01-01

    Hidden Cost, Value Lost , the fifth of a series of six books on the consequences of uninsurance in the United States, illustrates some of the economic and social losses to the country of maintaining...

  6. Transformational principles for NEON sampling of mammalian parasites and pathogens: a response to Springer et al. (2016)

    Science.gov (United States)

    The National Ecological Observatory Network (NEON) has recently released a series of protocols, presented with apparently broad community support, for studies of small mammals and parasites. Sampling designs were outlined, collectively aimed at understanding how changing environmental cond...

  7. Haptic Routes and digestive destinations in cooking series

    DEFF Research Database (Denmark)

    Waade, Anne Marit; Jørgensen, Ulla Angkjær

    2010-01-01

    and the media in which aesthetical, cultural and symbolic values are related to the way food is mediatised. The main argument is that cooking television series produce haptic images of place and food that include a specific sensuous and emotional relation between screen and viewer. The haptic imagery...

  8. Using probabilistic finite automata to simulate hourly series of global radiation

    Energy Technology Data Exchange (ETDEWEB)

    Mora-Lopez, L. [Universidad de Malaga (Spain). Dpto. Lenguajes y Computacion; Sidrach-de-Cardona, M. [Universidad de Malaga (Spain). Dpto. Fisica Aplicada II

    2003-03-01

    A model to generate synthetic series of hourly exposure of global radiation is proposed. This model has been constructed using a machine learning approach. It is based on the use of a subclass of probabilistic finite automata which can be used for variable-order Markov processes. This model allows us to represent the different relationships and the representative information observed in the hourly series of global radiation; the variable-order Markov process can be used as a natural way to represent different types of days, and to take into account the ''variable memory'' of cloudiness. A method to generate new series of hourly global radiation, which incorporates the randomness observed in recorded series, is also proposed. As input data this method only uses the mean monthly value of the daily solar global radiation. We examine if the recorded and simulated series are similar. It can be concluded that both series have the same statistical properties. (author)

  9. Protection of simple series and parallel systems with components of different values

    International Nuclear Information System (INIS)

    Bier, Vicki M.; Nagaraj, Aniruddha; Abhichandani, Vinod

    2005-01-01

    We apply game theory, optimization, and reliability analysis to identify optimal defenses against intentional threats to system reliability. The goals are to identify optimal strategies for allocating resources among possible defensive investments, and to develop qualitative guidelines that reflect those strategies. The novel feature of the approach is the use of reliability analysis together with game theory and optimization to study optimal management of intentional threats to system reliability. Thus, this work extends and adapts the existing body of game-theoretic work on security to systems with series or parallel structures. The results yield insights into the nature of optimal defensive investments that yield the best tradeoff between investment cost and security. In particular, the results illustrate how the optimal allocation of defensive investments depends on the structure of the system, the cost-effectiveness of infrastructure protection investments, and the adversary's goals and constraints

  10. Fast response double series resonant high-voltage DC-DC converter

    International Nuclear Information System (INIS)

    Lee, S S; Iqbal, S; Kamarol, M

    2012-01-01

    In this paper, a novel double series resonant high-voltage dc-dc converter with a dual-mode pulse frequency modulation (PFM) control scheme is proposed. The proposed topology consists of two series resonant tanks, and hence two resonant currents flow in each switching period. Moreover, it consists of two high-voltage transformers whose leakage inductances are absorbed as the resonant inductors of the series resonant tanks. The secondary outputs of both transformers are rectified and combined before supplying the load. In resonant mode operation, the series resonant tanks are energized alternately by controlling two Insulated Gate Bipolar Transistor (IGBT) switches with pulse frequency modulation (PFM). This topology operates in discontinuous conduction mode (DCM) with all IGBT switches operating under zero current switching (ZCS) conditions, and hence no switching loss occurs. To achieve a fast rise in output voltage, a dual-mode PFM control during start-up of the converter is proposed. In this operation, the inverter is started at a high switching frequency and, as the output voltage reaches 90% of the target value, the switching frequency is reduced to a value which corresponds to the target output voltage. This effectively reduces the rise time of the output voltage and prevents overshoot. Experimental results collected from a 100-W laboratory prototype are presented to verify the effectiveness of the proposed system.

  11. The Black Man on Film: Racial Stereotyping. Hayden Film Attitudes and Issues Series.

    Science.gov (United States)

    Maynard, Richard A.

    Motion pictures have long been recognized as a mirror of society's values and attitudes, and for motivational and impression-making impact they are unsurpassed. The Hayden Film Attitudes and Issues Series is based on the teacher's source book, the Celluloid Curriculum: How to Use Movies in the Classroom. This series presents written sources…

  12. A Time Series Forecasting Method

    Directory of Open Access Journals (Sweden)

    Wang Zhao-Yu

    2017-01-01

    Full Text Available This paper proposes a novel time series forecasting method based on a weighted self-constructing clustering technique. The weighted self-constructing clustering processes all the data patterns incrementally. If a data pattern is not similar enough to an existing cluster, it forms a new cluster of its own. However, if a data pattern is similar enough to an existing cluster, it is removed from the cluster it currently belongs to and added to the most similar cluster. During the clustering process, weights are learned for each cluster. Given a series of time-stamped data up to time t, we divide it into a set of training patterns. By using the weighted self-constructing clustering, the training patterns are grouped into a set of clusters. To estimate the value at time t + 1, we find the k nearest neighbors of the input pattern and use these k neighbors to decide the estimation. Experimental results are shown to demonstrate the effectiveness of the proposed approach.
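
    The k-nearest-neighbour estimation step described in this record can be sketched as follows; the weighted self-constructing clustering stage is omitted, and the pattern length, k and the synthetic data are illustrative assumptions, not the authors' settings.

```python
import numpy as np

def knn_forecast(series, pattern_len, k):
    """Forecast the next value of `series` from its k most similar past patterns."""
    series = np.asarray(series, dtype=float)
    # build training patterns: each row holds `pattern_len` lagged values, y holds the next value
    X, y = [], []
    for t in range(pattern_len, len(series)):
        X.append(series[t - pattern_len:t])
        y.append(series[t])
    X, y = np.array(X), np.array(y)
    query = series[-pattern_len:]                  # the most recent pattern
    dist = np.linalg.norm(X - query, axis=1)       # Euclidean distance to every training pattern
    nearest = np.argsort(dist)[:k]                 # indices of the k nearest neighbours
    weights = 1.0 / (dist[nearest] + 1e-12)        # closer neighbours get larger weight
    return float(np.sum(weights * y[nearest]) / np.sum(weights))

rng = np.random.default_rng(1)
data = np.sin(np.linspace(0, 20, 300)) + 0.1 * rng.normal(size=300)
print("next-value estimate:", round(knn_forecast(data, pattern_len=10, k=5), 3))
```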

  13. Arab drama series content analysis from a transnational Arab identity perspective

    Directory of Open Access Journals (Sweden)

    Joelle Chamieh

    2016-04-01

    Full Text Available The scientific contribution in deciphering drama series falls under the discipline of understanding the narratology of distinctive cultures and traditions within specific contexts of certain societies. This article spells out the interferences deployed by the provocations that are induced through the functions of values in modeling societies which are projected through the transmission of media. The proposed operational model consists of providing an à priori design of common Arab values assimilated into an innovative grid analysis code book that has enabled the execution of a systematic and reliable approach to the quantitative content analysis performance. Additionally, a more thorough qualitative content analysis has been implemented in terms of narratolgy where actions have been evaluated based on the grid analysis code book for a clearer perception of Arab values depicted in terms of their context within the Arab drama milieu. This approach has been deployed on four Arab drama series covering the transnational/national and non-divisive/divisive media aspects in the intention of extracting the transmitted values from a common identity perspective for cause of divulging Arab people’s expectancies.

  14. Changes in ecosystem service values in Zhoushan Island using remote sensing time series data

    Science.gov (United States)

    Zhang, Xiaoping; Qin, Yanpei; Lv, Ying; Zhen, Guangwei; Gong, Fang; Li, Chaokui

    2017-10-01

    The largest inhabited island, Zhoushan Island, is the center of economy, culture, shipping, and fishing in the Zhoushan Archipelago New Area. Its coastal wetland and tidal flats offer significant ecological services including floodwater storage, wildlife habitat, and buffers against tidal surges. Yet, large-scale land reclamation and new land development may dramatically change ecosystem services. In this research, we assess changes in ecosystem service values in Zhoushan Island for 1990, 2000 and 2011. Three LANDSAT TM and/or ETM data sets were used to determine the spatial pattern of land use, and previously published value coefficients were used to calculate the ecosystem service values delivered by each land category. The results show that the total value of ecosystem services in Zhoushan Island declined by 11% from 2920.07 billion Yuan to 2609.77 billion Yuan per year between 1990 and 2011. This decrease is largely attributable to the 51% loss of tidal flats. The combined ecosystem service values of woodland, paddy land and tidal flats were over 90% of the total values. The results indicate that future land-use policy should pay attention to the conservation of these ecosystems over uncontrolled reclamation and coastal industrial development, and that further coastal reclamation should be based on rigorous environmental impact analyses.

  15. Evaluation of scaling invariance embedded in short time series.

    Directory of Open Access Journals (Sweden)

    Xue Pan

    Full Text Available Scaling invariance of time series has been making great contributions in diverse research fields, but how to evaluate the scaling exponent from a real-world series is still an open problem. The finite length of a time series may induce unacceptable fluctuations and bias in statistical quantities, and consequently invalidate currently used standard methods. In this paper a new concept called correlation-dependent balanced estimation of diffusion entropy is developed to evaluate scale invariance in very short time series with length ~10^2. Calculations with specified Hurst exponent values of 0.2, 0.3, ..., 0.9 show that by using the standard central moving average de-trending procedure this method can evaluate the scaling exponents for short time series with negligible bias (≤0.03) and a narrow confidence interval (standard deviation ≤0.05). Considering the stride series from ten volunteers along an approximate oval path of a specified length, we observe that though the averages and deviations of the scaling exponents are close, their evolutionary behaviors display rich patterns. The method has potential use in analyzing physiological signals, detecting early warning signals, and so on. We emphasize that our core contribution is that, by means of the proposed method, one can precisely estimate the Shannon entropy from limited records.

  16. Evaluation of scaling invariance embedded in short time series.

    Science.gov (United States)

    Pan, Xue; Hou, Lei; Stephen, Mutua; Yang, Huijie; Zhu, Chenping

    2014-01-01

    Scaling invariance of time series has been making great contributions in diverse research fields, but how to evaluate the scaling exponent from a real-world series is still an open problem. The finite length of a time series may induce unacceptable fluctuations and bias in statistical quantities, and consequently invalidate currently used standard methods. In this paper a new concept called correlation-dependent balanced estimation of diffusion entropy is developed to evaluate scale invariance in very short time series with length ~10^2. Calculations with specified Hurst exponent values of 0.2, 0.3, ..., 0.9 show that by using the standard central moving average de-trending procedure this method can evaluate the scaling exponents for short time series with negligible bias (≤0.03) and a narrow confidence interval (standard deviation ≤0.05). Considering the stride series from ten volunteers along an approximate oval path of a specified length, we observe that though the averages and deviations of the scaling exponents are close, their evolutionary behaviors display rich patterns. The method has potential use in analyzing physiological signals, detecting early warning signals, and so on. We emphasize that our core contribution is that, by means of the proposed method, one can precisely estimate the Shannon entropy from limited records.
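
    The correlation-dependent balanced estimation of diffusion entropy is specific to this paper and is not reproduced here. As a hedged baseline only, the sketch below estimates a scaling exponent with standard detrended fluctuation analysis (DFA), a conventional alternative for scaling-exponent estimation; the scale choices and the test series are assumptions.

```python
import numpy as np

def dfa_exponent(x, scales=None):
    """Estimate a scaling exponent by detrended fluctuation analysis (order-1 detrending)."""
    x = np.asarray(x, dtype=float)
    profile = np.cumsum(x - x.mean())              # integrated (profile) series
    n = len(profile)
    if scales is None:
        scales = np.unique(np.logspace(np.log10(4), np.log10(n // 4), 10).astype(int))
    flucts = []
    for s in scales:
        rms = []
        for i in range(n // s):
            seg = profile[i * s:(i + 1) * s]
            t = np.arange(s)
            coef = np.polyfit(t, seg, 1)           # local linear trend in each window
            rms.append(np.sqrt(np.mean((seg - np.polyval(coef, t)) ** 2)))
        flucts.append(np.mean(rms))
    slope, _ = np.polyfit(np.log(scales), np.log(flucts), 1)
    return slope                                    # ~0.5 for white noise, >0.5 for persistence

rng = np.random.default_rng(2)
print("DFA exponent of white noise:", round(dfa_exponent(rng.normal(size=500)), 2))
```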

  17. An algorithm of Saxena-Easo on fuzzy time series forecasting

    Science.gov (United States)

    Ramadhani, L. C.; Anggraeni, D.; Kamsyakawuni, A.; Hadi, A. F.

    2018-04-01

    This paper presents a Saxena-Easo fuzzy time series forecasting model to study the prediction of the Indonesian inflation rate in 1970-2016. We use MATLAB software to compute this method. The Saxena-Easo fuzzy time series algorithm does not require stationarity, unlike conventional forecasting methods, is capable of dealing with time series values that are linguistic, and has the advantage of reducing calculation time and simplifying the calculation process. Generally it focuses on percentage change as the universe of discourse, interval partition and defuzzification. The results indicate that the actual data and the forecast data are close enough, with Root Mean Square Error (RMSE) = 1.5289.

  18. The Algebra of a q-Analogue of Multiple Harmonic Series

    Directory of Open Access Journals (Sweden)

    Yoshihiro Takeyama

    2013-10-01

    Full Text Available We introduce an algebra which describes the multiplication structure of a family of q-series containing a q-analogue of multiple zeta values. The double shuffle relations are formulated in our framework. They contain a q-analogue of Hoffman's identity for multiple zeta values. We also discuss the dimension of the space spanned by the linear relations realized in our algebra.

  19. Statistical inference for classification of RRIM clone series using near IR reflectance properties

    Science.gov (United States)

    Ismail, Faridatul Aima; Madzhi, Nina Korlina; Hashim, Hadzli; Abdullah, Noor Ezan; Khairuzzaman, Noor Aishah; Azmi, Azrie Faris Mohd; Sampian, Ahmad Faiz Mohd; Harun, Muhammad Hafiz

    2015-08-01

    The RRIM clone is a rubber breeding series produced by RRIM (Rubber Research Institute of Malaysia) through a "rubber breeding program" to improve latex yield and produce clones attractive to farmers. The objective of this work is to analyse measurements made by an optical sensing device on latex of selected clone series. The device transmits in the NIR range and its reflectance is converted into a voltage. The reflectance index values obtained via voltage were analysed using statistical techniques in order to find out the discrimination among the clones. From the statistical results using error plots and a one-way ANOVA test, there is overwhelming evidence of discrimination of the RRIM 2002, RRIM 2007 and RRIM 3001 clone series, with p value = 0.000. RRIM 2008 cannot be discriminated from RRIM 2014; however, both of these groups are distinct from the other clones.
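
    The one-way ANOVA test mentioned in this record can be reproduced in outline with scipy; the clone labels and the voltage readings below are synthetic placeholders, not the measured data.

```python
import numpy as np
from scipy import stats

# hypothetical reflectance-index readings (volts) for three clone groups
rng = np.random.default_rng(3)
rrim_2002 = rng.normal(2.10, 0.05, size=30)
rrim_2007 = rng.normal(2.35, 0.05, size=30)
rrim_3001 = rng.normal(2.60, 0.05, size=30)

# one-way ANOVA: a small p-value indicates that at least one group mean differs
f_stat, p_value = stats.f_oneway(rrim_2002, rrim_2007, rrim_3001)
print(f"F = {f_stat:.2f}, p = {p_value:.4g}")
```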

  20. Visibility graph analysis on quarterly macroeconomic series of China based on complex network theory

    Science.gov (United States)

    Wang, Na; Li, Dong; Wang, Qiwen

    2012-12-01

    The visibility graph approach and complex network theory provide new insight into time series analysis. The inheritance of the visibility graph from the original time series was further explored in the paper. We found that degree distributions of visibility graphs extracted from Pseudo Brownian Motion series obtained by the Frequency Domain algorithm exhibit exponential behaviors, in which the exponential exponent is a binomial function of the Hurst index inherited in the time series. Our simulations showed that the quantitative relations between the Hurst indexes and the exponents of the degree distribution function are different for different series and that the visibility graph inherits some important features of the original time series. Further, we convert some quarterly macroeconomic series, including the growth rates of value-added of three industry series and the growth rates of Gross Domestic Product series of China, to graphs by the visibility algorithm and explore the topological properties of the graphs associated with the four macroeconomic series, namely, the degree distribution and correlations, the clustering coefficient, the average path length, and community structure. Based on complex network analysis we find that the degree distributions of associated networks from the growth rates of value-added of three industry series are almost exponential and the degree distributions of associated networks from the growth rates of GDP series are scale free. We also discuss the assortativity and disassortativity of the four associated networks as they relate to the evolutionary process of the original macroeconomic series. All the constructed networks have “small-world” features. The community structures of associated networks suggest dynamic changes of the original macroeconomic series. We also detected the relationship among government policy changes, community structures of associated networks and macroeconomic dynamics. We find great influences of government
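
    A minimal sketch of the natural visibility algorithm (Lacasa et al.) that underlies this kind of analysis is given below; the O(n^2) visibility check and the random test series are illustrative, and the network statistics computed in the record (clustering, path length, communities) are not reproduced.

```python
import numpy as np

def visibility_graph_edges(y):
    """Return the edge list of the natural visibility graph of series y (O(n^2) check)."""
    y = np.asarray(y, dtype=float)
    n = len(y)
    edges = []
    for i in range(n - 1):
        for j in range(i + 1, n):
            # (i, j) are connected if every intermediate point lies below the line joining them
            visible = all(
                y[k] < y[j] + (y[i] - y[j]) * (j - k) / (j - i)
                for k in range(i + 1, j)
            )
            if visible:
                edges.append((i, j))
    return edges

rng = np.random.default_rng(4)
series = rng.random(200)
edges = visibility_graph_edges(series)
degree = np.zeros(len(series), dtype=int)
for i, j in edges:
    degree[i] += 1
    degree[j] += 1
print("mean degree of the visibility graph:", degree.mean())
```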

  1. Boundary value problems and partial differential equations

    CERN Document Server

    Powers, David L

    2005-01-01

    Boundary Value Problems is the leading text on boundary value problems and Fourier series. The author, David Powers (Clarkson), has written a thorough, theoretical overview of solving boundary value problems involving partial differential equations by the method of separation of variables. Professors and students agree that the author is a master at creating linear problems that adroitly illustrate the techniques of separation of variables used to solve problems in science and engineering. Features include a CD with animations and graphics of solutions, additional exercises and chapter review questions; nearly 900 exercises ranging in difficulty; and many fully worked examples.

  2. Magnetic Field Emission Comparison for Series-Parallel and Series-Series Wireless Power Transfer to Vehicles – PART 2/2

    DEFF Research Database (Denmark)

    Batra, Tushar; Schaltz, Erik

    2014-01-01

    Series-series and series-parallel topologies are the most favored topologies for design of wireless power transfer system for vehicle applications. The series-series topology has the advantage of reflecting only the resistive part on the primary side. On the other hand, the current source output characteristics of the series-parallel topology are more suited for the battery of the vehicle. This paper compares the two topologies in terms of magnetic emissions to the surroundings for the same input power, primary current, quality factor and inductors. Theoretical and simulation results show that the series...

  3. Light Scattering Reviews, Vol 6 Light Scattering and Remote Sensing of Atmosphere and Surface

    CERN Document Server

    Kokhanovsky, Alexander A

    2012-01-01

    This is the next volume in the Light Scattering Reviews series. Volumes 1-5 have already been published by Springer. The volume is composed of several papers (usually about ten) by leading researchers in the respective fields. The main focus of this book is light scattering, radiative transfer and the optics of snow.

  4. Research on Kalman Filtering Algorithm for Deformation Information Series of Similar Single-Difference Model

    Institute of Scientific and Technical Information of China (English)

    L(U) Wei-cai; XU Shao-quan

    2004-01-01

    When the similar single-difference methodology (SSDM) is used to solve for the deformation values of the monitoring points, the deformation information series is sometimes unstable. To overcome this shortcoming, a Kalman filtering algorithm for this series is established, and its correctness and validity are verified with test data obtained on a movable platform in the plane. The results show that Kalman filtering can improve the correctness, reliability and stability of the deformation information series.
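
    A scalar random-walk Kalman filter, the simplest version of the filtering idea used in this record, can be sketched as follows; the process and measurement variances and the synthetic deformation series are assumed values, not those of the SSDM study.

```python
import numpy as np

def kalman_filter_1d(z, process_var=1e-3, meas_var=0.09):
    """Filter a noisy scalar series with a random-walk state model x_k = x_{k-1} + w_k."""
    x_est, p_est = z[0], 1.0          # initial state estimate and its variance
    out = []
    for meas in z:
        # predict step
        x_pred, p_pred = x_est, p_est + process_var
        # update step
        k_gain = p_pred / (p_pred + meas_var)
        x_est = x_pred + k_gain * (meas - x_pred)
        p_est = (1.0 - k_gain) * p_pred
        out.append(x_est)
    return np.array(out)

rng = np.random.default_rng(5)
true_deformation = np.linspace(0.0, 5.0, 100)                 # slow, smooth movement (mm)
observed = true_deformation + rng.normal(0, 0.3, size=100)    # noisy monitoring values
filtered = kalman_filter_1d(observed)
print("raw RMS error:     ", round(float(np.sqrt(np.mean((observed - true_deformation) ** 2))), 3))
print("filtered RMS error:", round(float(np.sqrt(np.mean((filtered - true_deformation) ** 2))), 3))
```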

  5. The Diffusion and Value of Healthcare Information Technology

    CERN Document Server

    Bower, Anthony G

    2006-01-01

    Through a series of interviews and database analyses, and an extensive literature review and synthesis, this report characterizes the diffusion of use of electronic health records and places a value on that diffusion.

  6. Data imputation analysis for Cosmic Rays time series

    Science.gov (United States)

    Fernandes, R. C.; Lucio, P. S.; Fernandez, J. H.

    2017-05-01

    The occurrence of missing data in Galactic Cosmic Ray (GCR) time series is inevitable, since loss of data is due to mechanical and human failure or technical problems and to the different periods of operation of GCR stations. The aim of this study was to perform multiple dataset imputation in order to depict the observational dataset. The study used the monthly time series of GCR Climax (CLMX) and Roma (ROME) from 1960 to 2004 to simulate scenarios of 10%, 20%, 30%, 40%, 50%, 60%, 70%, 80% and 90% of missing data compared to the observed ROME series, with 50 replicates. The CLMX station was then used as a proxy for the allocation of these scenarios. Three different methods for monthly dataset imputation were selected: AMÉLIA II - runs the bootstrap Expectation Maximization algorithm, MICE - runs an algorithm via Multivariate Imputation by Chained Equations, and MTSDI - an Expectation Maximization algorithm-based method for imputation of missing values in multivariate normal time series. The synthetic time series compared with the observed ROME series were also evaluated using several skill measures such as RMSE, NRMSE, the Agreement Index, R, R2, the F-test and the t-test. The results showed that for CLMX and ROME, the R2 and R statistics were equal to 0.98 and 0.96, respectively. It was observed that increases in the number of gaps generate loss of quality of the time series. Data imputation was most efficient with the MTSDI method, with negligible errors and the best skill coefficients. The results suggest a limit of about 60% of missing data for imputation of monthly averages, but no more than this. It is noteworthy that the CLMX, ROME and KIEL stations present no missing data in the target period. This methodology allowed 43 time series to be reconstructed.
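
    The skill measures named in this record (RMSE, NRMSE, agreement index, R, R2) can be computed with a few lines of numpy; the sketch below uses synthetic stand-in data rather than the CLMX/ROME series, and the F-test and t-test steps are omitted.

```python
import numpy as np

def skill_scores(obs, sim):
    """Basic skill measures for comparing a simulated/imputed series with observations."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    err = sim - obs
    rmse = np.sqrt(np.mean(err ** 2))
    nrmse = rmse / (obs.max() - obs.min())
    r = np.corrcoef(obs, sim)[0, 1]
    # Willmott's index of agreement
    d = 1 - np.sum(err ** 2) / np.sum((np.abs(sim - obs.mean()) + np.abs(obs - obs.mean())) ** 2)
    return {"RMSE": rmse, "NRMSE": nrmse, "R": r, "R2": r ** 2, "agreement": d}

rng = np.random.default_rng(6)
obs = rng.normal(100, 10, size=240)          # e.g. monthly cosmic-ray counts (arbitrary units)
sim = obs + rng.normal(0, 2, size=240)       # an imputed/reconstructed version of the same series
print(skill_scores(obs, sim))
```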

  7. Transmission Line Resonator Segmented with Series Capacitors

    DEFF Research Database (Denmark)

    Zhurbenko, Vitaliy; Boer, Vincent; Petersen, Esben Thade

    2016-01-01

    Transmission line resonators are often used as coils in high field MRI. Due to the distributed nature of such resonators, coils based on them produce an inhomogeneous field. This work investigates the application of series capacitors to improve field homogeneity along the resonator. The equations for optimal values of evenly distributed capacitors are presented. The performances of the segmented resonator and a regular transmission line resonator are compared.

  8. A case series study on complications after breast augmentation with Macrolane™.

    Science.gov (United States)

    Becchere, M P; Farace, F; Dessena, L; Marongiu, Francesco; Bulla, A; Simbula, L; Meloni, G B; Rubino, C

    2013-04-01

    The use of Macrolane™ seems to have several advantages compared to the other standard methods for breast augmentation: it is faster, less invasive, and requires only local anesthesia. Nevertheless, various complications associated with the use of Macrolane™ have been described, e.g., encapsulated lumps in breast tissue, infection, and parenchymal fibrosis. We report the results of our case series study on the clinical and imaging evaluations of patients who came to our attention after breast augmentation with Macrolane™ injection and evaluate the effect of this treatment on breast cancer screening procedures. Between September 2009 and July 2010, seven patients, treated elsewhere with intramammary Macrolane™ injection for cosmetic purposes, presented to our institution complaining of breast pain. In all patients, Macrolane™ had been injected under local anesthesia in the retromammary space through a surgical cannula. On mammography, nodules appeared as gross lobulated radiopacities with polycyclic contours. On breast ultrasound, the nodules showed hypo-anaechogenic cystlike features. In all cases, image analysis by the radiologist was hindered by the presence of the implanted substance, which did not allow the complete inspection of the whole breast tissue. From our experience, although safe in other areas, injection of Macrolane™ into breast tissue cannot be recommended at this time. Our study, along with other reports, supports the need to start a clinical trial on the use of injectable fillers in the breast to validate their safety and effectiveness. This journal requires that authors assign a level of evidence to each article. For a full description of these Evidence-Based Medicine ratings, please refer to the Table of Contents or the online Instructions to Authors www.springer.com/00266 .

  9. Reconfiguring global pharmaceutical value networks through targeted technology interventions

    OpenAIRE

    Harrington, Tomas Seosamh; Phillips, MA; Srai, Jagjit Singh

    2016-01-01

    Targeting a series of advanced manufacturing technology (AMT) ‘interventions’ provides the potential for significant step changes across the pharmaceutical value chain, from early stage ‘system discovery’ and clinical trials, through to novel service supply models. This research explores future value network configurations which, when aligned with disruptive shifts in technology (process and digital), may enable alternative routes to medicines production and the delivery of additional value t...

  10. Evolution of the Sunspot Number and Solar Wind B Time Series

    Science.gov (United States)

    Cliver, Edward W.; Herbst, Konstantin

    2018-03-01

    The past two decades have witnessed significant changes in our knowledge of long-term solar and solar wind activity. The sunspot number time series (1700-present) developed by Rudolf Wolf during the second half of the 19th century was revised and extended by the group sunspot number series (1610-1995) of Hoyt and Schatten during the 1990s. The group sunspot number is significantly lower than the Wolf series before ˜1885. An effort from 2011-2015 to understand and remove differences between these two series via a series of workshops had the unintended consequence of prompting several alternative constructions of the sunspot number. Thus it has been necessary to expand and extend the sunspot number reconciliation process. On the solar wind side, after a decade of controversy, an ISSI International Team used geomagnetic and sunspot data to obtain a high-confidence time series of the solar wind magnetic field strength (B) from 1750-present that can be compared with two independent long-term (> ˜600 year) series of annual B-values based on cosmogenic nuclides. In this paper, we trace the twists and turns leading to our current understanding of long-term solar and solar wind activity.

  11. Rethinking Value in the Bio-economy

    Science.gov (United States)

    2016-01-01

    Current debates in science and technology studies emphasize that the bio-economy—or, the articulation of capitalism and biotechnology—is built on notions of commodity production, commodification, and materiality, emphasizing that it is possible to derive value from body parts, molecular and cellular tissues, biological processes, and so on. What is missing from these perspectives, however, is consideration of the political-economic actors, knowledges, and practices involved in the creation and management of value. As part of a rethinking of value in the bio-economy, this article analyzes three key political-economic processes: financialization, capitalization, and assetization. In doing so, it argues that value is managed as part of a series of valuation practices, it is not inherent in biological materialities. PMID:28458406

  12. Computations of Eisenstein series on Fuchsian groups

    Science.gov (United States)

    Avelin, Helen

    2008-09-01

    We present numerical investigations of the value distribution and distribution of Fourier coefficients of the Eisenstein series E(z;s) on arithmetic and non-arithmetic Fuchsian groups. Our numerics indicate a Gaussian limit value distribution for a real-valued rotation of E(z;s) as $\operatorname{Re} s = 1/2$, $\operatorname{Im} s \to \infty$ and also, on non-arithmetic groups, a complex Gaussian limit distribution for E(z;s) when $\operatorname{Re} s > 1/2$ near $1/2$ and $\operatorname{Im} s \to \infty$, at least if we allow $\operatorname{Re} s \to 1/2$ at some rate. Furthermore, on non-arithmetic groups and for fixed s with $\operatorname{Re} s \ge 1/2$ near $1/2$, our numerics indicate a Gaussian limit distribution for the appropriately normalized Fourier coefficients.

  13. Kriging Methodology and Its Development in Forecasting Econometric Time Series

    Directory of Open Access Journals (Sweden)

    Andrej Gajdoš

    2017-03-01

    Full Text Available One of the approaches for forecasting future values of a time series or unknown spatial data is kriging. The main objective of the paper is to introduce a general scheme of kriging in forecasting econometric time series using a family of linear regression time series models (abbreviated FDSLRM) which apply regression not only to the trend but also to a random component of the observed time series. Simultaneously, performing a Monte Carlo simulation study with a real electricity consumption dataset in the R computational language and environment, we investigate the well-known problem of “negative” estimates of variance components, when kriging predictions fail. Our subsequent theoretical analysis, employing the modern apparatus of advanced multivariate statistics, gives the formulation and proof of a general theorem about the explicit form of moments (up to sixth order) for a Gaussian time series observation. This result provides a basis for further theoretical and computational research in the development of the kriging methodology.
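
    The FDSLRM models are not reproduced here; as a hedged illustration of how kriging weights arise from a covariance model and a linear solve, the sketch below implements a simple-kriging-style predictor with an exponential covariance on synthetic data. The covariance parameters and the forecast horizon are assumptions.

```python
import numpy as np

def exp_cov(h, sill=1.0, range_len=5.0):
    """Exponential covariance as a function of distance/lag h."""
    return sill * np.exp(-np.abs(h) / range_len)

def simple_kriging(t_obs, y_obs, t_new, sill=1.0, range_len=5.0, nugget=1e-6):
    """Simple-kriging prediction at times t_new, using the sample mean as the known mean."""
    mean = y_obs.mean()
    resid = y_obs - mean
    d_obs = np.abs(t_obs[:, None] - t_obs[None, :])
    K = exp_cov(d_obs, sill, range_len) + nugget * np.eye(len(t_obs))   # observation covariance
    d_new = np.abs(t_new[:, None] - t_obs[None, :])
    k = exp_cov(d_new, sill, range_len)                                 # cross-covariance
    weights = np.linalg.solve(K, resid)            # kriging weights folded into K^{-1} residuals
    return mean + k @ weights

rng = np.random.default_rng(12)
t_obs = np.sort(rng.uniform(0, 30, 40))
y_obs = np.sin(t_obs / 3.0) + 0.1 * rng.normal(size=t_obs.size)   # stand-in for consumption data
t_new = np.linspace(30, 36, 7)                                     # forecast horizon
print(np.round(simple_kriging(t_obs, y_obs, t_new), 3))
```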

  14. Analysis of series resistance effects on forward I - V and C - V characteristics of mis type diodes

    International Nuclear Information System (INIS)

    Altindal, S.; Tekeli, Z.; Karadeniz, S.; Tugluoglu, N.; Ercan, I.

    2002-01-01

    In order to determine the series resistance Rs, we have followed Lie et al., Cheung et al. and Kang et al., using the plot of I vs dV/dLn(I), which is a linear curve over a wide range of current values at each temperature. The values of Rs were obtained from the slope of the linear parts of the curves, and the series resistance at each temperature was then evaluated from the Ln(I) vs (V-IRs) curves, which are linear over a wide range of voltage. The most reliable values of the ideality factor n and the reverse saturation current Is were then determined. In addition, the role of the series resistance in the C-V and G-V characteristics of the diode has been investigated. Both C-V and G-V measurements show that the measured capacitance and conductance vary significantly with applied bias and frequency due to the presence of Rs. The density of interface states, the barrier height and the series resistance obtained from the forward bias I-V characteristics using this method agree very well with those obtained from the capacitance technique. It is clear that ignoring the series resistance (in a device with high series resistance) can lead to significant errors in the analysis of the I-V-T, C-V-f and G-V-f characteristics.
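
    A Cheung-type extraction of the series resistance, based on dV/d(ln I) = I*Rs + n*kT/q, can be sketched as below; the synthetic diode parameters are assumed values chosen only to show that the slope of the fit recovers Rs and the intercept the ideality factor.

```python
import numpy as np

Q = 1.602176634e-19   # elementary charge (C)
KB = 1.380649e-23     # Boltzmann constant (J/K)

def cheung_series_resistance(v, i, temperature=300.0):
    """Estimate Rs and ideality factor n from dV/d(ln I) = I*Rs + n*kT/q."""
    v, i = np.asarray(v, float), np.asarray(i, float)
    dv_dlni = np.gradient(v, np.log(i))            # numerical derivative dV/d(ln I)
    slope, intercept = np.polyfit(i, dv_dlni, 1)   # linear fit over the forward-bias region
    rs = slope
    n = intercept * Q / (KB * temperature)
    return rs, n

# synthetic forward I-V data of a diode with a known series resistance, for illustration only
n_true, rs_true, i0, temp = 1.8, 50.0, 1e-9, 300.0
i = np.logspace(-6, -3, 60)
v = n_true * KB * temp / Q * np.log(i / i0 + 1.0) + i * rs_true
rs_est, n_est = cheung_series_resistance(v, i, temperature=temp)
print(f"Rs ~ {rs_est:.1f} ohm, n ~ {n_est:.2f}")
```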

  15. Comparison of different Methods for Univariate Time Series Imputation in R

    OpenAIRE

    Moritz, Steffen; Sardá, Alexis; Bartz-Beielstein, Thomas; Zaefferer, Martin; Stork, Jörg

    2015-01-01

    Missing values in datasets are a well-known problem and there are quite a lot of R packages offering imputation functions. But while imputation in general is well covered within R, it is hard to find functions for imputation of univariate time series. The problem is, most standard imputation techniques can not be applied directly. Most algorithms rely on inter-attribute correlations, while univariate time series imputation needs to employ time dependencies. This paper provides an overview of ...

  16. Theta series, wall-crossing and quantum dilogarithm identities

    CERN Document Server

    Alexandrov, Sergei

    2016-01-01

    Motivated by mathematical structures which arise in string vacua and gauge theories with N=2 supersymmetry, we study the properties of certain generalized theta series which appear as Fourier coefficients of functions on a twisted torus. In Calabi-Yau string vacua, such theta series encode instanton corrections from $k$ Neveu-Schwarz five-branes. The theta series are determined by vector-valued wave-functions, and in this work we obtain the transformation of these wave-functions induced by Kontsevich-Soibelman symplectomorphisms. This effectively provides a quantum version of these transformations, where the quantization parameter is inversely proportional to the five-brane charge $k$. Consistency with wall-crossing implies a new five-term relation for Faddeev's quantum dilogarithm $\Phi_b$ at $b=1$, which we prove. By allowing the torus to be non-commutative, we obtain a more general five-term relation valid for arbitrary $b$ and $k$, which may be relevant for the physics of five-branes at finite chemical po...

  17. Stochastic generation of hourly wind speed time series

    International Nuclear Information System (INIS)

    Shamshad, A.; Wan Mohd Ali Wan Hussin; Bawadi, M.A.; Mohd Sanusi, S.A.

    2006-01-01

    In the present study, hourly wind speed data of Kuala Terengganu in Peninsular Malaysia are simulated using the transition matrix approach of a Markov process. The wind speed time series is divided into various states based on certain criteria. The next wind speed state is selected based on the previous state. The cumulative probability transition matrix has been formed, in which each row ends with 1. Using uniform random numbers between 0 and 1, a series of future states is generated. These states are then converted to the corresponding wind speed values using another uniform random number generator. The accuracy of the model has been determined by comparing statistical characteristics such as the average, standard deviation, root mean square error, probability density function and autocorrelation function of the generated data to those of the original data. The generated wind speed time series is capable of preserving the wind speed characteristics of the observed data.
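
    The transition-matrix procedure described in this record can be sketched with a first-order Markov chain; the state boundaries, representative speeds and the stand-in wind data are assumptions, not the Kuala Terengganu measurements.

```python
import numpy as np

def fit_transition_matrix(series, bins):
    """Estimate a first-order Markov transition matrix from a discretised series."""
    states = np.digitize(series, bins)              # state index (0 .. len(bins)) per observation
    n_states = len(bins) + 1
    counts = np.zeros((n_states, n_states))
    for a, b in zip(states[:-1], states[1:]):
        counts[a, b] += 1
    counts[counts.sum(axis=1) == 0] = 1.0           # unseen states fall back to a uniform row
    return counts / counts.sum(axis=1, keepdims=True), states

def generate_series(trans, centres, start_state, length, rng):
    """Sample a synthetic state path with uniform random numbers and map states to speeds."""
    cum = np.cumsum(trans, axis=1)                  # cumulative transition matrix; each row ends with 1
    path = [start_state]
    for _ in range(length - 1):
        path.append(int(np.searchsorted(cum[path[-1]], rng.random())))
    return centres[np.array(path)]

rng = np.random.default_rng(7)
wind = np.abs(rng.normal(5.0, 2.0, size=2000))      # stand-in for hourly wind speed (m/s)
bins = np.array([2.0, 4.0, 6.0, 8.0])               # state boundaries
centres = np.array([1.0, 3.0, 5.0, 7.0, 9.0])       # representative speed for each state
trans, states = fit_transition_matrix(wind, bins)
synthetic = generate_series(trans, centres, states[0], len(wind), rng)
print("observed mean:", round(float(wind.mean()), 2), "synthetic mean:", round(float(synthetic.mean()), 2))
```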

  18. Multiresolution analysis of Bursa Malaysia KLCI time series

    Science.gov (United States)

    Ismail, Mohd Tahir; Dghais, Amel Abdoullah Ahmed

    2017-05-01

    In general, a time series is simply a sequence of numbers collected at regular intervals over a period. Financial time series data processing is concerned with the theory and practice of processing asset prices over time, such as currency, commodity and stock market data. The primary aim of this study is to understand the fundamental characteristics of selected financial time series by using time domain as well as frequency domain analysis. After that, prediction can be executed for the desired system for in-sample forecasting. In this study, multiresolution analysis, with the aid of the discrete wavelet transform (DWT) and the maximal overlap discrete wavelet transform (MODWT), is used to pinpoint special characteristics of the Bursa Malaysia KLCI (Kuala Lumpur Composite Index) daily closing prices and return values. In addition, further case study discussions include the modeling of the Bursa Malaysia KLCI using linear ARIMA with wavelets to address how the multiresolution approach improves fitting and forecasting results.
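
    A multilevel DWT decomposition along the lines described here can be run with PyWavelets; the sketch below uses a synthetic price series and the db4 wavelet as assumptions, and shows only the DWT (MODWT analogues exist in several packages but are not shown).

```python
import numpy as np
import pywt  # PyWavelets

rng = np.random.default_rng(8)
prices = 1500 + np.cumsum(rng.normal(0, 5, size=512))   # stand-in for daily closing prices

# multilevel DWT with the Daubechies-4 wavelet, 4 decomposition levels
coeffs = pywt.wavedec(prices, "db4", level=4)
approx, details = coeffs[0], coeffs[1:]                 # details run from coarsest (cD4) to finest (cD1)
for name, d in zip(["cD4", "cD3", "cD2", "cD1"], details):
    print(f"{name}: {len(d)} coefficients, energy {np.sum(d ** 2):.1f}")

# coarse reconstruction: keep only the level-4 approximation and zero all detail bands
smooth = pywt.waverec([approx] + [np.zeros_like(d) for d in details], "db4")[: len(prices)]
print("mean |price - smooth trend|:", round(float(np.mean(np.abs(prices - smooth))), 2))
```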

  19. Linking the Negative Binomial and Logarithmic Series Distributions via their Associated Series

    OpenAIRE

    SADINLE, MAURICIO

    2008-01-01

    The negative binomial distribution is associated to the series obtained by taking derivatives of the logarithmic series. Conversely, the logarithmic series distribution is associated to the series found by integrating the series associated to the negative binomial distribution. The parameter of the number of failures of the negative binomial distribution is the number of derivatives needed to obtain the negative binomial series from the logarithmic series. The reasoning in this article could ...

  20. Off-line tracking of series parameters in distribution systems using AMI data

    Energy Technology Data Exchange (ETDEWEB)

    Williams, Tess L.; Sun, Yannan; Schneider, Kevin

    2016-05-01

    Electric distribution systems have historically lacked measurement points, and equipment is often operated to its failure point, resulting in customer outages. The widespread deployment of sensors at the distribution level is enabling observability. This paper presents an off-line parameter value tracking procedure that takes advantage of the increasing number of measurement devices being deployed at the distribution level to estimate changes in series impedance parameter values over time. The tracking of parameter values enables non-diurnal and non-seasonal change to be flagged for investigation. The presented method uses an unbalanced Distribution System State Estimation (DSSE) and a measurement residual-based parameter estimation procedure. Measurement residuals from multiple measurement snapshots are combined in order to increase the effective local redundancy and improve the robustness of the calculations in the presence of measurement noise. Data from devices on the primary distribution system and from customer meters, via an AMI system, form the input data set. Results of simulations on the IEEE 13-Node Test Feeder are presented to illustrate the proposed approach applied to changes in series impedance parameters. A 5% change in series resistance elements can be detected in the presence of 2% measurement error when combining less than 1 day of measurement snapshots into a single estimate.

  1. Computational Biology Support: RECOMB Conference Series (Conference Support)

    Energy Technology Data Exchange (ETDEWEB)

    Michael Waterman

    2006-06-15

    This funding supported student and postdoctoral attendance at the annual RECOMB Conference from 2001 to 2005. The RECOMB Conference series was founded in 1997 to provide a scientific forum for theoretical advances in computational biology and their applications in molecular biology and medicine. The conference series aims at attracting research contributions in all areas of computational molecular biology. Typical, but not exclusive, topics of interest are: Genomics, Molecular sequence analysis, Recognition of genes and regulatory elements, Molecular evolution, Protein structure, Structural genomics, Gene Expression, Gene Networks, Drug Design, Combinatorial libraries, Computational proteomics, and Structural and functional genomics. The origins of the conference lie on the mathematical and computational side of the field, and there remains a certain focus on computational advances. However, the effective application of computational techniques to biological innovation is also an important aspect of the conference. The conference has had a growing number of attendees, topping 300 in recent years and often exceeding 500. The conference program includes between 30 and 40 contributed papers, selected by an international program committee of around 30 experts during a rigorous review process rivaling the editorial procedure for top-rate scientific journals. In previous years paper selection has been made from up to 130-200 submissions from well over a dozen countries. 10-page extended abstracts of the contributed papers are collected in a volume published by ACM Press and Springer, and are available at the conference. Full versions of a selection of the papers are published annually in a special issue of the Journal of Computational Biology devoted to the RECOMB Conference. A further point in the program is a lively poster session. From 120 to 300 posters have been presented each year since RECOMB 2000. One of the highlights of each RECOMB conference is a

  2. Exact series expansions, recurrence relations, properties and integrals of the generalized exponential integral functions

    International Nuclear Information System (INIS)

    Altac, Zekeriya

    2007-01-01

    Generalized exponential integral functions (GEIF) are encountered in multi-dimensional thermal radiative transfer problems in the integral equation kernels. Several series expansions for the first-order generalized exponential integral function, along with a series expansion for the general nth order GEIF, are derived. The convergence issues of these series expansions are investigated numerically as well as theoretically, and a recurrence relation which does not require derivatives of the GEIF is developed. The exact series expansions of the two dimensional cylindrical and/or two-dimensional planar integral kernels as well as their spatial moments have been explicitly derived and compared with numerical values

  3. Notes on economic time series analysis system theoretic perspectives

    CERN Document Server

    Aoki, Masanao

    1983-01-01

    In seminars and graduate level courses I have had several opportunities to discuss modeling and analysis of time series with economists and economic graduate students during the past several years. These experiences made me aware of a gap between what economic graduate students are taught about vector-valued time series and what is available in recent system literature. Wishing to fill or narrow the gap that I suspect is more widely spread than my personal experiences indicate, I have written these notes to augment and reorganize materials I have given in these courses and seminars. I have endeavored to present, in as much a self-contained way as practicable, a body of results and techniques in system theory that I judge to be relevant and useful to economists interested in using time series in their research. I have essentially acted as an intermediary and interpreter of system theoretic results and perspectives in time series by filtering out non-essential details, and presenting coherent accounts of wha...

  4. Design and Implementation of a Professional Development Course Series.

    Science.gov (United States)

    Welch, Beth; Spooner, Joshua J; Tanzer, Kim; Dintzner, Matthew R

    2017-12-01

    Objective. To design and implement a longitudinal course series focused on professional development and professional identity formation in pharmacy students at Western New England University. Methods. A four-year, theme-based course series was designed to sequentially and longitudinally impart the values, attributes, and characteristics of a professional pharmacist. Requirements of the course include: goal planning and reflective assignments, submission of "Best Works," attendance at professional meetings, completion of service hours, annual completion of a Pharmacy Professionalism Instrument, attendance at Dean's Seminar, participation in roundtable discussions, and maintenance of an electronic portfolio. Though the Professional Development course series carries no credit, these courses are progression requirements and students are assessed on a pass/fail basis. Results. Course pass rates in the 2015-2016 academic year for all four classes were 99% to 100%, suggesting the majority of students take professional development seriously and are achieving the intended outcomes of the courses. Conclusion. A professional development course series was designed and implemented in the new Doctor of Pharmacy program at Western New England University to enhance the professional identity formation of students.

  5. Magnetic Field Emission Comparison for Series-Parallel and Series-Series Wireless Power Transfer to Vehicles – PART 1/2

    DEFF Research Database (Denmark)

    Batra, Tushar; Schaltz, Erik

    2014-01-01

    Resonant circuits of wireless power transfer system can be designed in four possible ways by placing the primary and secondary capacitor in a series or parallel order with respect to the corresponding inductor. The two topologies series-parallel and series-series under investigation have already been compared in terms of their output behavior (current or voltage source) and reflection of the secondary impedance on the primary side. In this paper it is shown that for the same power rating series-parallel topology emits lesser magnetic fields to the surroundings than its series...

  6. RankExplorer: Visualization of Ranking Changes in Large Time Series Data.

    Science.gov (United States)

    Shi, Conglei; Cui, Weiwei; Liu, Shixia; Xu, Panpan; Chen, Wei; Qu, Huamin

    2012-12-01

    For many applications involving time series data, people are often interested in the changes of item values over time as well as their ranking changes. For example, people search many words via search engines like Google and Bing every day. Analysts are interested in both the absolute searching number for each word as well as their relative rankings. Both sets of statistics may change over time. For very large time series data with thousands of items, how to visually present ranking changes is an interesting challenge. In this paper, we propose RankExplorer, a novel visualization method based on ThemeRiver to reveal the ranking changes. Our method consists of four major components: 1) a segmentation method which partitions a large set of time series curves into a manageable number of ranking categories; 2) an extended ThemeRiver view with embedded color bars and changing glyphs to show the evolution of aggregation values related to each ranking category over time as well as the content changes in each ranking category; 3) a trend curve to show the degree of ranking changes over time; 4) rich user interactions to support interactive exploration of ranking changes. We have applied our method to some real time series data and the case studies demonstrate that our method can reveal the underlying patterns related to ranking changes which might otherwise be obscured in traditional visualizations.

  7. Numerical Solutions of Fifth Order Boundary Value Problems Using

    African Journals Online (AJOL)

    Dr A.B.Ahmed

    1 Department of Mathematics, Delta State University, Abraka, Nigeria. 2 Department of ... International Journal of Computational Mathematics and ... Value Problems using Power Series Approximation Method. Applied Mathematics, 7, 1215-.

  8. Algorithm for Compressing Time-Series Data

    Science.gov (United States)

    Hawkins, S. Edward, III; Darlington, Edward Hugo

    2012-01-01

    An algorithm based on Chebyshev polynomials effects lossy compression of time-series data or other one-dimensional data streams (e.g., spectral data) that are arranged in blocks for sequential transmission. The algorithm was developed for use in transmitting data from spacecraft scientific instruments to Earth stations. In spite of its lossy nature, the algorithm preserves the information needed for scientific analysis. The algorithm is computationally simple, yet compresses data streams by factors much greater than two. The algorithm is not restricted to spacecraft or scientific uses: it is applicable to time-series data in general. The algorithm can also be applied to general multidimensional data that have been converted to time-series data, a typical example being image data acquired by raster scanning. However, unlike most prior image-data-compression algorithms, this algorithm neither depends on nor exploits the two-dimensional spatial correlations that are generally present in images. In order to understand the essence of this compression algorithm, it is necessary to understand that the net effect of this algorithm and the associated decompression algorithm is to approximate the original stream of data as a sequence of finite series of Chebyshev polynomials. For the purpose of this algorithm, a block of data or interval of time for which a Chebyshev polynomial series is fitted to the original data is denoted a fitting interval. Chebyshev approximation has two properties that make it particularly effective for compressing serial data streams with minimal loss of scientific information: The errors associated with a Chebyshev approximation are nearly uniformly distributed over the fitting interval (this is known in the art as the "equal error property"); and the maximum deviations of the fitted Chebyshev polynomial from the original data have the smallest possible values (this is known in the art as the "min-max property").
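
    A blockwise Chebyshev fit of the kind described here can be sketched with numpy's Chebyshev utilities; the block length, polynomial degree and test signal are illustrative assumptions, and no quantization or transmission framing is modelled.

```python
import numpy as np
from numpy.polynomial import chebyshev as C

def compress(series, block_len=64, degree=7):
    """Fit a Chebyshev series to each block; keep only degree+1 coefficients per block.

    Assumes len(series) is a multiple of block_len."""
    coeff_sets = []
    for start in range(0, len(series), block_len):
        block = series[start:start + block_len]
        x = np.linspace(-1.0, 1.0, len(block))       # map the fitting interval onto [-1, 1]
        coeff_sets.append(C.chebfit(x, block, degree))
    return coeff_sets

def decompress(coeff_sets, block_len=64):
    x = np.linspace(-1.0, 1.0, block_len)
    return np.concatenate([C.chebval(x, coeffs) for coeffs in coeff_sets])

rng = np.random.default_rng(9)
t = np.linspace(0, 8 * np.pi, 1024)
signal = np.sin(t) + 0.02 * rng.normal(size=t.size)  # smooth signal with mild noise
coeff_sets = compress(signal, block_len=64, degree=7)
recon = decompress(coeff_sets, block_len=64)
ratio = signal.size / sum(len(c) for c in coeff_sets)
print(f"compression ratio ~ {ratio:.1f}x, max abs error = {np.max(np.abs(recon - signal)):.3f}")
```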

  9. Postmaterialist Values and Adult Political Learning. Intracohort Value Change in Western Europe

    Directory of Open Access Journals (Sweden)

    Raül Tormos

    2012-01-01

    Full Text Available Research on value change and stability tends to underline the importance of generational effects, Inglehart's theory of post-materialism being an example of this. According to his theory, formative experiences shape the values of each age-cohort, and social change takes place progressively due to the force of generational replacement. This article analyzes survey data covering a wider period of observations than the one Inglehart used to draw his conclusions. By applying time series techniques, I find significant changes within each generation over time. I show how an important adult learning process in the field of post-materialist values has taken place, which has been neglected by the empirical literature. Contrary to Inglehart's point of view, I conclude that period effects are not just minor short-term influences affecting the "normal" change due to generational replacement, but a systematic intracohort trend linked to the European economic prosperity of recent decades.

  10. The Value of Children: A Cross-National Study, Volume Two. Philippines.

    Science.gov (United States)

    Bulatao, Rodolfo A.

    This volume, second in a series of seven reports of the Value of Children Project, discusses results of the survey in the Philippines. The study identifies major values and disvalues that Filipino parents attach to children. It also examines characteristics of parents that are related to values and disvalues. The document is presented in seven…

  11. On reliability of singular-value decomposition in attractor reconstruction

    International Nuclear Information System (INIS)

    Palus, M.; Dvorak, I.

    1990-12-01

    Applicability of singular-value decomposition for reconstructing the strange attractor from a one-dimensional chaotic time series, as proposed by Broomhead and King, is extensively tested and discussed. Previously published doubts about its reliability are confirmed: singular-value decomposition, by nature a linear method, is of only limited power when nonlinear structures are studied. (author). 29 refs, 9 figs
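
    The linear reconstruction step that the record scrutinizes, forming a trajectory (embedding) matrix and taking its SVD as in Broomhead and King, can be sketched as follows; the window length and the two-tone test signal are assumptions.

```python
import numpy as np

def trajectory_matrix(x, window):
    """Stack delayed copies of the scalar series into a trajectory (embedding) matrix."""
    x = np.asarray(x, dtype=float)
    n_rows = len(x) - window + 1
    return np.array([x[i:i + window] for i in range(n_rows)])

rng = np.random.default_rng(10)
t = np.arange(0, 60, 0.05)
series = np.sin(t) + 0.5 * np.sin(2.3 * t) + 0.05 * rng.normal(size=t.size)

X = trajectory_matrix(series, window=20)
# Broomhead-King style step: SVD of the centred trajectory matrix
X_centred = X - X.mean(axis=0)
_, s, _ = np.linalg.svd(X_centred, full_matrices=False)
explained = s ** 2 / np.sum(s ** 2)
print("singular spectrum (first 5 fractions):", np.round(explained[:5], 3))
# the number of 'significant' singular values is usually read off this spectrum,
# but, as the record notes, this linear criterion can mislead for nonlinear structure
```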

  12. Summation of series

    CERN Document Server

    Jolley, LB W

    2004-01-01

    Over 1,100 common series, all grouped for easy reference. Arranged by category, these series include arithmetical and geometrical progressions, powers and products of natural numbers, figurate and polygonal numbers, inverse natural numbers, exponential and logarithmic series, binomials, simple inverse products, factorials, trigonometrical and hyperbolic expansions, and additional series. 1961 edition.

  13. Resolving issues concerning Eskdalemuir geomagnetic hourly values

    Directory of Open Access Journals (Sweden)

    S. Macmillan

    2011-02-01

    Full Text Available The hourly values of the geomagnetic field from 1911 to 1931 derived from measurements made at Eskdalemuir observatory in the UK, and available online from the World Data Centre for Geomagnetism at http://www.wdc.bgs.ac.uk/, have now been corrected. Previously they were 2-point averaged and transformed from the original north, east and vertical down values in the tables in the observatory yearbooks. This paper documents the course of events from discovering the post-processing done to the data to the final resolution of the problem. As it was through the development of a new index, the Inter-Hour Variability index, that this post-processing came to light, we provide a revised series of this index for Eskdalemuir and compare it with that from another European observatory. Conclusions of studies concerning long-term magnetic field variability and inferred solar variability, whilst not necessarily consistent with one another, are not obviously invalidated by the incorrect hourly values from Eskdalemuir. This series of events illustrates the challenges that lie ahead in removing any remaining errors and inconsistencies in the data holdings of different World Data Centres.

  14. Estimation of value at risk and conditional value at risk using normal mixture distributions model

    Science.gov (United States)

    Kamaruzzaman, Zetty Ain; Isa, Zaidi

    2013-04-01

    Normal mixture distributions model has been successfully applied in financial time series analysis. In this paper, we estimate the return distribution, value at risk (VaR) and conditional value at risk (CVaR) for monthly and weekly rates of returns for FTSE Bursa Malaysia Kuala Lumpur Composite Index (FBMKLCI) from July 1990 until July 2010 using the two component univariate normal mixture distributions model. First, we present the application of normal mixture distributions model in empirical finance where we fit our real data. Second, we present the application of normal mixture distributions model in risk analysis where we apply the normal mixture distributions model to evaluate the value at risk (VaR) and conditional value at risk (CVaR) with model validation for both risk measures. The empirical results provide evidence that using the two components normal mixture distributions model can fit the data well and can perform better in estimating value at risk (VaR) and conditional value at risk (CVaR) where it can capture the stylized facts of non-normality and leptokurtosis in returns distribution.
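
    As an illustration of the kind of computation described above (not the authors' code), the sketch below fits a two-component normal mixture to a return series with scikit-learn and approximates VaR and CVaR at the 95% level by Monte Carlo sampling from the fitted mixture. The synthetic returns and the 95% level are assumptions standing in for the FBMKLCI data.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(42)
# Placeholder returns standing in for FBMKLCI weekly returns (assumption).
returns = rng.normal(0.001, 0.02, size=1000)

# Fit a two-component univariate normal mixture, as in the abstract.
gmm = GaussianMixture(n_components=2, random_state=0)
gmm.fit(returns.reshape(-1, 1))

# Monte Carlo approximation of VaR and CVaR at the 95% level from the fitted mixture.
samples, _ = gmm.sample(200_000)
samples = samples.ravel()
var_95 = -np.quantile(samples, 0.05)            # loss threshold exceeded 5% of the time
cvar_95 = -samples[samples <= -var_95].mean()   # mean loss beyond the VaR threshold
print(f"VaR(95%) = {var_95:.4f}, CVaR(95%) = {cvar_95:.4f}")
```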

  15. Solutions of diffusion equations in two-dimensional cylindrical geometry by series expansions

    International Nuclear Information System (INIS)

    Ohtani, Nobuo

    1976-01-01

    A solution of the multi-group multi-regional diffusion equation in two-dimensional cylindrical (rho-z) geometry is obtained in the form of a regionwise double series composed of Bessel and trigonometrical functions. The diffusion equation is multiplied by weighting functions, which satisfy the homogeneous part of the diffusion equation, and the products are integrated over the region for obtaining the equations to determine the fluxes and their normal derivatives at the region boundaries. Multiplying the diffusion equation by each function of the set used for the flux expansion, then integrating the products, the coefficients of the double series of the flux inside each region are calculated using the boundary values obtained above. Since the convergence of the series thus obtained is slow especially near the region boundaries, a method for improving the convergence has been developed. The double series of the flux is separated into two parts. The normal derivative at the region boundary of the first part is zero, and that of the second part takes the value which is obtained in the first stage of this method. The second part is replaced by a continuous function, and the flux is represented by the sum of the continuous function and the double series. A sample critical problem of a two-group two-region system is numerically studied. The results show that the present method yields very accurately the flux integrals in each region with only a small number of expansion terms. (auth.)

  16. Øjet springer over muren

    DEFF Research Database (Denmark)

    Kragh-Müller, Grethe

    2010-01-01

    The article deals with "the room as the third teacher". It describes how the pedagogical work in Reggio Emilia, Italy, emphasizes designing good learning environments for children. It includes an account of how learning is conceived and organized on a constructivist basis, with emphasis on the understanding that the child actively constructs its own knowledge. Finally, it discusses the criticism that Loris Malaguzzi, one of the founders of and an inspiration for the pedagogical work in Reggio Emilia, has directed at traditional school teaching.

  17. Springer handbook of ocean engineering

    CERN Document Server

    Xiros, Nikolaos

    2016-01-01

    The handbook is the definitive reference for the interdisciplinary field that is ocean engineering. It integrates the coverage of fundamental and applied material and encompasses a diverse spectrum of systems, concepts and operations in the maritime environment, as well as providing a comprehensive update on contemporary, leading-edge ocean technologies. Coverage includes, but is not limited to, an overview of ocean science, ocean signals and instrumentation, coastal structures, developments in ocean energy technologies, and ocean vehicles and automation. The handbook will be of interest to practitioners in a range of offshore industries and naval establishments as well as academic researchers and graduate students in ocean, coastal, offshore, and marine engineering and naval architecture.

  18. The Springer index of viruses

    National Research Council Canada - National Science Library

    Buchen-Osmond, Cornelia; Darai, Gholamreza; Tidona, Christian A

    2002-01-01

    .... Each of the 241 taxonomically ordered chapters includes detailed information on individual genus members, historical events, virion morphology, genome properties, replication strategy, properties...

  19. The Springer index of viruses

    National Research Council Canada - National Science Library

    Buchen-Osmond, Cornelia; Darai, Gholamreza; Tidona, Christian A

    2002-01-01

    ... of individual transcripts and proteins, sequence accession numbers, biological properties, diseases, recombinant vector constructs, vaccine strains, key references, as well as a high-resolution particle...

  20. The Springer index of viruses

    National Research Council Canada - National Science Library

    Buchen-Osmond, Cornelia; Darai, Gholamreza; Tidona, Christian A

    2002-01-01

    ... image and a drawing of the genome organization. Its high content of easily accessible detail information makes this Encyclopedic Reference an indispensable tool for both researchers and lecturers" [publisher's web site].

  1. Metabonomics and Intensive Care

    OpenAIRE

    Antcliffe, D; Gordon, AC

    2016-01-01

    This article is one of ten reviews selected from the Annual Update in Intensive Care and Emergency medicine 2016. Other selected articles can be found online at http://www.biomedcentral.com/collections/annualupdate2016. Further information about the Annual Update in Intensive Care and Emergency Medicine is available from http://www.springer.com/series/8901.

  2. Change detection in a series of Sentinel-1 SAR data

    DEFF Research Database (Denmark)

    Nielsen, Allan Aasbjerg; Conradsen, Knut; Skriver, Henning

    2017-01-01

    Based on an omnibus likelihood ratio test statistic for the equality of several variance-covariance matrices following the complex Wishart distribution with an associated p-value and a factorization of this test statistic, change analysis in a time series of seven multilook, dual polarization...

  3. Which DTW Method Applied to Marine Univariate Time Series Imputation

    OpenAIRE

    Phan , Thi-Thu-Hong; Caillault , Émilie; Lefebvre , Alain; Bigand , André

    2017-01-01

    International audience; Missing data are ubiquitous in all domains of applied sciences. Processing datasets containing missing values can lead to a loss of efficiency and unreliable results, especially for large missing sub-sequence(s). Therefore, the aim of this paper is to build a framework for filling missing values in univariate time series and to perform a comparison of different similarity metrics used for the imputation task. This allows us to suggest the most suitable methods for the imp...

  4. Uranium series disequilibrium studies at the Broubster analogue site

    International Nuclear Information System (INIS)

    Longworth, G.; Ivanovich, M.; Wilkins, M.A.

    1990-11-01

    Uranium series measurements at a natural analogue site at Broubster, Caithness have been used to investigate radionuclide migration over periods ranging from several hundred to 10^6 years. The measured values for the uranium concentration and the activity ratios 234U/238U and 230Th/234U indicate that the geochemical system is more complicated than the originally proposed picture of uranium dispersion and water transport into a peat bog. There appears to be little thorium mobility, although there is evidence for an appreciable fraction of thorium on the colloidal phase. (author)

  5. A series solution of the Falkner-Skan equation using the crocco-wang transformation

    Science.gov (United States)

    Asaithambi, Asai

    A direct series solution for the Falkner-Skan equation is obtained by first transforming the problem using the Crocco-Wang transformation. The transformation converts the third-order problem to a second-order two-point boundary value problem. The method first constructs a series involving the unknown skin-friction coefficient α. Then, α is determined by using the secant method or Newton’s method. The derivative needed for Newton’s method is also computed using a series derived from the transformed differential equation. The method is validated by solving the Falkner-Skan equation for several cases reported previously in the literature.
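
    The abstract determines the unknown skin-friction coefficient α by a secant iteration. The hedged sketch below illustrates the same secant idea, but applies it to a conventional shooting method (numerical integration of the Falkner-Skan equation with scipy) rather than to the Crocco-Wang series used in the paper; the value of β, the truncated domain and the initial guesses are assumptions made for illustration.

```python
import numpy as np
from scipy.integrate import solve_ivp

BETA = 0.5        # pressure-gradient parameter (assumed value for illustration)
ETA_MAX = 10.0    # finite stand-in for the far-field boundary

def falkner_skan_rhs(eta, y):
    # y = [f, f', f'']; Falkner-Skan: f''' + f f'' + beta (1 - f'^2) = 0
    f, fp, fpp = y
    return [fp, fpp, -f * fpp - BETA * (1.0 - fp ** 2)]

def residual(alpha):
    """Mismatch of the far-field condition f'(ETA_MAX) = 1 for a guessed f''(0) = alpha."""
    sol = solve_ivp(falkner_skan_rhs, (0.0, ETA_MAX), [0.0, 0.0, alpha],
                    rtol=1e-8, atol=1e-10)
    return sol.y[1, -1] - 1.0

# Secant iteration on alpha, mirroring the role it plays for the series coefficient.
a0, a1 = 0.8, 1.0
for _ in range(30):
    r0, r1 = residual(a0), residual(a1)
    a0, a1 = a1, a1 - r1 * (a1 - a0) / (r1 - r0)
    if abs(a1 - a0) < 1e-10:
        break
print(f"skin-friction coefficient alpha = f''(0) ≈ {a1:.6f}")
```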

  6. Improving cluster-based missing value estimation of DNA microarray data.

    Science.gov (United States)

    Brás, Lígia P; Menezes, José C

    2007-06-01

    We present a modification of the weighted K-nearest neighbours imputation method (KNNimpute) for missing values (MVs) estimation in microarray data based on the reuse of estimated data. The method was called iterative KNN imputation (IKNNimpute) as the estimation is performed iteratively using the recently estimated values. The estimation efficiency of IKNNimpute was assessed under different conditions (data type, fraction and structure of missing data) by the normalized root mean squared error (NRMSE) and the correlation coefficients between estimated and true values, and compared with that of other cluster-based estimation methods (KNNimpute and sequential KNN). We further investigated the influence of imputation on the detection of differentially expressed genes using SAM by examining the differentially expressed genes that are lost after MV estimation. The performance measures give consistent results, indicating that the iterative procedure of IKNNimpute can enhance the prediction ability of cluster-based methods in the presence of high missing rates, in non-time series experiments and in data sets comprising both time series and non-time series data, because the information of the genes having MVs is used more efficiently and the iterative procedure allows refining the MV estimates. More importantly, IKNN has a smaller detrimental effect on the detection of differentially expressed genes.
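
    A minimal numpy sketch of the iterative idea behind IKNNimpute as summarized above: start from a simple fill, then repeatedly re-estimate each missing entry from the K nearest rows under the current estimates. The inverse-distance weighting, the fixed iteration count and all names are simplifications and assumptions, not the authors' exact algorithm.

```python
import numpy as np

def iterative_knn_impute(X, k=10, n_iter=5, eps=1e-12):
    """Iteratively impute missing values (NaNs) in a genes-by-samples matrix X."""
    X = np.array(X, dtype=float)
    missing = np.isnan(X)
    # Initial fill with per-row (per-gene) means; assumes no row is entirely missing.
    row_means = np.nanmean(X, axis=1)
    filled = np.where(missing, row_means[:, None], X)

    for _ in range(n_iter):
        for i in np.where(missing.any(axis=1))[0]:
            # Distances from row i to all other rows under the current estimates.
            d = np.sqrt(((filled - filled[i]) ** 2).sum(axis=1))
            d[i] = np.inf
            neighbours = np.argsort(d)[:k]
            w = 1.0 / (d[neighbours] + eps)           # inverse-distance weights
            est = (w[:, None] * filled[neighbours]).sum(axis=0) / w.sum()
            filled[i, missing[i]] = est[missing[i]]   # update only the missing entries
    return filled

# Toy usage with a small expression-like matrix containing NaNs.
rng = np.random.default_rng(1)
X = rng.normal(size=(50, 8))
X[rng.random(X.shape) < 0.1] = np.nan
X_imputed = iterative_knn_impute(X, k=5, n_iter=3)
print(np.isnan(X_imputed).sum())   # 0: all missing entries estimated
```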

  7. Time Series Analysis of 3D Coordinates Using Nonstochastic Observations

    NARCIS (Netherlands)

    Velsink, H.

    2016-01-01

    Adjustment and testing of a combination of stochastic and nonstochastic observations is applied to the deformation analysis of a time series of 3D coordinates. Nonstochastic observations are constant values that are treated as if they were observations. They are used to formulate constraints on

  8. Time Series Analysis of 3D Coordinates Using Nonstochastic Observations

    NARCIS (Netherlands)

    Hiddo Velsink

    2016-01-01

    From the article: Abstract Adjustment and testing of a combination of stochastic and nonstochastic observations is applied to the deformation analysis of a time series of 3D coordinates. Nonstochastic observations are constant values that are treated as if they were observations. They are used to

  9. Interglacial climate dynamics and advanced time series analysis

    Science.gov (United States)

    Mudelsee, Manfred; Bermejo, Miguel; Köhler, Peter; Lohmann, Gerrit

    2013-04-01

    , Fischer H, Joos F, Knutti R, Lohmann G, Masson-Delmotte V (2010) What caused Earth's temperature variations during the last 800,000 years? Data-based evidence on radiative forcing and constraints on climate sensitivity. Quaternary Science Reviews 29:129. Loulergue L, Schilt A, Spahni R, Masson-Delmotte V, Blunier T, Lemieux B, Barnola J-M, Raynaud D, Stocker TF, Chappellaz J (2008) Orbital and millennial-scale features of atmospheric CH4 over the past 800,000 years. Nature 453:383. Lüthi D, Le Floch M, Bereiter B, Blunier T, Barnola J-M, Siegenthaler U, Raynaud D, Jouzel J, Fischer H, Kawamura K, Stocker TF (2008) High-resolution carbon dioxide concentration record 650,000-800,000 years before present. Nature 453:379. Mudelsee M (2000) Ramp function regression: A tool for quantifying climate transitions. Computers and Geosciences 26:293. Mudelsee M (2002) TAUEST: A computer program for estimating persistence in unevenly spaced weather/climate time series. Computers and Geosciences 28:69. Mudelsee M (2010) Climate Time Series Analysis: Classical Statistical and Bootstrap Methods. Springer, Dordrecht, 474 pp. [www.manfredmudelsee.com/book] Siegenthaler U, Stocker TF, Monnin E, Lüthi D, Schwander J, Stauffer B, Raynaud D, Barnola J-M, Fischer H, Masson-Delmotte V, Jouzel J (2005) Stable carbon cycle-climate relationship during the late Pleistocene. Science 310:1313.

  10. ALBEDO PATTERN RECOGNITION AND TIME-SERIES ANALYSES IN MALAYSIA

    Directory of Open Access Journals (Sweden)

    S. A. Salleh

    2012-07-01

    Full Text Available Pattern recognition and time-series analyses enable one to evaluate and generate predictions of specific phenomena. Albedo pattern and time-series analyses are very useful, especially in relation to climate condition monitoring. This study was conducted to investigate changes in Malaysia's albedo pattern. The pattern recognition and changes will be useful for a variety of environmental and climate monitoring researches such as carbon budgeting and aerosol mapping. Ten years (2000-2009) of MODIS satellite images were used for the analyses and interpretation. These images were processed using ERDAS Imagine remote sensing software, ArcGIS 9.3, the 6S code for atmospheric calibration and several MODIS tools (MRT, HDF2GIS, Albedo tools). Several methods for time-series analyses were explored; this paper demonstrates trend and seasonal time-series analyses using the converted HDF-format MODIS MCD43A3 albedo land product. The results revealed significant changes in albedo percentages over the past 10 years and in the pattern with regard to Malaysia's nebulosity index (NI) and aerosol optical depth (AOD). A noticeable trend can be identified with regard to the maximum and minimum values of the albedo. The rise and fall of the line graph show a similar trend with regard to the daily observations; the difference can be identified in terms of the value or percentage of the rises and falls of albedo. Thus, it can be concluded that the temporal behaviour of land surface albedo in Malaysia shows uniform behaviour and effects with regard to the local monsoons. However, although the average albedo shows a linear trend with the nebulosity index, the pattern changes of albedo with respect to the nebulosity index indicate that there are external factors affecting the albedo values, as the sky conditions and the diffusion plotted do not have a uniform trend over the years, especially when the trend at 5-year intervals is examined; 2000 shows high

  11. Emerging interdependence between stock values during financial crashes

    OpenAIRE

    Rocchi, Jacopo; Tsui, Enoch Yan Lok; Saad, David

    2016-01-01

    To identify emerging interdependencies between traded stocks we investigate the behavior of the stocks of FTSE 100 companies in the period 2000-2015, by looking at daily stock values. Exploiting the power of information theoretical measures to extract direct influences between multiple time series, we compute the information flow across stock values to identify several different regimes. While only small information flows are detected in most of the period, a dramatically different situation occurs...

  12. How Can Value-Added Measures Be Used for Teacher Improvement? What We Know Series: Value-Added Methods and Applications. Knowledge Brief 13

    Science.gov (United States)

    Loeb, Susanna

    2013-01-01

    The question for this brief is whether education leaders can use value-added measures as tools for improving schooling and, if so, how to do this. Districts, states, and schools can, at least in theory, generate gains in educational outcomes for students using value-added measures in three ways: creating information on effective programs, making…

  13. Adaptive Anchoring Model: How Static and Dynamic Presentations of Time Series Influence Judgments and Predictions.

    Science.gov (United States)

    Kusev, Petko; van Schaik, Paul; Tsaneva-Atanasova, Krasimira; Juliusson, Asgeir; Chater, Nick

    2018-01-01

    When attempting to predict future events, people commonly rely on historical data. One psychological characteristic of judgmental forecasting of time series, established by research, is that when people make forecasts from series, they tend to underestimate future values for upward trends and overestimate them for downward ones, so-called trend-damping (modeled by anchoring on, and insufficient adjustment from, the average of recent time series values). Events in a time series can be experienced sequentially (dynamic mode), or they can also be retrospectively viewed simultaneously (static mode), not experienced individually in real time. In one experiment, we studied the influence of presentation mode (dynamic and static) on two sorts of judgment: (a) predictions of the next event (forecast) and (b) estimation of the average value of all the events in the presented series (average estimation). Participants' responses in dynamic mode were anchored on more recent events than in static mode for all types of judgment but with different consequences; hence, dynamic presentation improved prediction accuracy, but not estimation. These results are not anticipated by existing theoretical accounts; we develop and present an agent-based model-the adaptive anchoring model (ADAM)-to account for the difference between processing sequences of dynamically and statically presented stimuli (visually presented data). ADAM captures how variation in presentation mode produces variation in responses (and the accuracy of these responses) in both forecasting and judgment tasks. ADAM's model predictions for the forecasting and judgment tasks fit better with the response data than a linear-regression time series model. Moreover, ADAM outperformed autoregressive-integrated-moving-average (ARIMA) and exponential-smoothing models, while neither of these models accounts for people's responses on the average estimation task. Copyright © 2017 The Authors. Cognitive Science published by Wiley

  14. Travel Series as TV Entertainment: Genre characteristics and touristic views on foreign countries

    Directory of Open Access Journals (Sweden)

    Anne Marit Waade

    2009-04-01

    Full Text Available Why is it not the deprived developing country, but rather the tempting destination the host arrives in when guiding the audience in a travel series? And how can we explore the specific combination of entertainment and education that travel series represent? Basically the travel series genre is a hybrid of journalistic documentary, entertaining lifestyle series and TV ads and the different series put different emphasis on the different genre elements. Travel series represent a certain kind of mediated consumption and they reflect lifestyle identity in relation to touristic consumer cultures. Like other lifestyle series dealing with consumption products and lifestyle markers encompassing fashion, food, garden, design and interior that balance somewhere between journalism and advertising, travel series typically deal with destinations, travel modes, cultural experiences and food as commodities. To understand the cultural and democratic value of travel series as a popular TV genre in the context of public service broadcasting, it is not the fact that the series contain educative and enlightening information about foreign cultures told in an entertaining and popular way that are of my interest. Rather it is tourism and media consumer culture as such, one has to expound as valuable democratic and cultural practice. The article presents different matrices of the respectively cultural and consumer knowledge that the different types of travel series include.

  15. Conditional mode regression: Application to functional time series prediction

    OpenAIRE

    Dabo-Niang, Sophie; Laksaci, Ali

    2008-01-01

    We consider $\alpha$-mixing observations and deal with the estimation of the conditional mode of a scalar response variable $Y$ given a random variable $X$ taking values in a semi-metric space. We provide a convergence rate in $L^p$ norm of the estimator. A useful and typical application to functional time series prediction is given.

  16. Interpolation techniques used for data quality control and calculation of technical series: an example of a Central European daily time series

    Czech Academy of Sciences Publication Activity Database

    Štěpánek, P.; Zahradníček, P.; Huth, Radan

    2011-01-01

    Roč. 115, 1-2 (2011), s. 87-98 ISSN 0324-6329 R&D Projects: GA ČR GA205/08/1619 Institutional research plan: CEZ:AV0Z30420517 Keywords : data quality control * filling missing values * interpolation techniques * climatological time series Subject RIV: DG - Athmosphere Sciences, Meteorology Impact factor: 0.364, year: 2011 http://www.met.hu/en/ismeret-tar/kiadvanyok/idojaras/index.php?id=34

  17. STATIONARITY OF ANNUAL MAXIMUM DAILY STREAMFLOW TIME SERIES IN SOUTH-EAST BRAZILIAN RIVERS

    Directory of Open Access Journals (Sweden)

    Jorge Machado Damázio

    2015-08-01

    Full Text Available DOI: 10.12957/cadest.2014.18302 The paper presents a statistical analysis of annual maximum daily streamflows between 1931 and 2013 in South-East Brazil, focused on detecting and modelling non-stationarity aspects. Flood protection for the large valleys in South-East Brazil is provided by multiple-purpose reservoir systems built during the 20th century, whose design and operation plans have been prepared assuming stationarity of the historical flood time series. Land cover changes and the rapidly increasing level of atmospheric greenhouse gases of the last century may be affecting flood regimes in these valleys, so non-stationary modelling may need to be applied to re-assess dam safety and flood control operation rules at the existing reservoir systems. Six annual maximum daily streamflow time series are analysed. The time series are plotted together with fitted smooth loess functions, and non-parametric statistical tests are performed to check the significance of apparent trends shown by the plots. Non-stationarity is modelled by fitting univariate extreme value distribution functions whose location varies linearly with time. Stationary and non-stationary models are compared with the likelihood ratio statistic. In four of the six analysed time series, non-stationary modelling outperformed stationary modelling. Keywords: Stationarity; Extreme Value Distributions; Flood Frequency Analysis; Maximum Likelihood Method.
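
    A hedged sketch of the comparison described above: fit a stationary GEV and a GEV whose location varies linearly with time to a series of annual maxima, then compare the two fits with a likelihood-ratio statistic (chi-square with one degree of freedom). Synthetic maxima stand in for the streamflow records, and scipy's genextreme parameterization (shape c = -xi) is used; the exact fitting procedure of the paper may differ.

```python
import numpy as np
from scipy.stats import genextreme, chi2
from scipy.optimize import minimize

years = np.arange(83)                                   # e.g. 1931-2013
# Synthetic annual maxima with a mild upward trend in the location parameter.
annual_max = genextreme.rvs(c=-0.1, loc=100 + 0.3 * years, scale=20, random_state=6)

# Stationary fit (scipy's shape c corresponds to -xi in the usual GEV notation).
c0, loc0, scale0 = genextreme.fit(annual_max)
ll_stat = genextreme.logpdf(annual_max, c0, loc=loc0, scale=scale0).sum()

# Non-stationary fit: location mu(t) = mu0 + mu1 * t, scale and shape constant.
def neg_ll(params):
    mu0, mu1, scale, c = params
    if scale <= 0:
        return np.inf
    lp = genextreme.logpdf(annual_max, c, loc=mu0 + mu1 * years, scale=scale)
    return np.inf if not np.all(np.isfinite(lp)) else -lp.sum()

res = minimize(neg_ll, x0=[loc0, 0.0, scale0, c0], method="Nelder-Mead")
ll_nonstat = -res.fun

# Likelihood-ratio test: one extra parameter (the linear trend in the location).
lr = 2.0 * (ll_nonstat - ll_stat)
p_value = chi2.sf(lr, df=1)
print(f"trend mu1 = {res.x[1]:.3f}, LR = {lr:.2f}, p = {p_value:.4f}")
```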

  18. Clinical time series prediction: Toward a hierarchical dynamical system framework.

    Science.gov (United States)

    Liu, Zitao; Hauskrecht, Milos

    2015-09-01

    Developing machine learning and data mining algorithms for building temporal models of clinical time series is important for understanding of the patient condition, the dynamics of a disease, effect of various patient management interventions and clinical decision making. In this work, we propose and develop a novel hierarchical framework for modeling clinical time series data of varied length and with irregularly sampled observations. Our hierarchical dynamical system framework for modeling clinical time series combines advantages of the two temporal modeling approaches: the linear dynamical system and the Gaussian process. We model the irregularly sampled clinical time series by using multiple Gaussian process sequences in the lower level of our hierarchical framework and capture the transitions between Gaussian processes by utilizing the linear dynamical system. The experiments are conducted on the complete blood count (CBC) panel data of 1000 post-surgical cardiac patients during their hospitalization. Our framework is evaluated and compared to multiple baseline approaches in terms of the mean absolute prediction error and the absolute percentage error. We tested our framework by first learning the time series model from data for the patients in the training set, and then using it to predict future time series values for the patients in the test set. We show that our model outperforms multiple existing models in terms of its predictive accuracy. Our method achieved a 3.13% average prediction accuracy improvement on ten CBC lab time series when it was compared against the best performing baseline. A 5.25% average accuracy improvement was observed when only short-term predictions were considered. A new hierarchical dynamical system framework that lets us model irregularly sampled time series data is a promising new direction for modeling clinical time series and for improving their predictive performance. Copyright © 2014 Elsevier B.V. All rights reserved.

  19. Clinical time series prediction: towards a hierarchical dynamical system framework

    Science.gov (United States)

    Liu, Zitao; Hauskrecht, Milos

    2014-01-01

    Objective Developing machine learning and data mining algorithms for building temporal models of clinical time series is important for understanding of the patient condition, the dynamics of a disease, effect of various patient management interventions and clinical decision making. In this work, we propose and develop a novel hierarchical framework for modeling clinical time series data of varied length and with irregularly sampled observations. Materials and methods Our hierarchical dynamical system framework for modeling clinical time series combines advantages of the two temporal modeling approaches: the linear dynamical system and the Gaussian process. We model the irregularly sampled clinical time series by using multiple Gaussian process sequences in the lower level of our hierarchical framework and capture the transitions between Gaussian processes by utilizing the linear dynamical system. The experiments are conducted on the complete blood count (CBC) panel data of 1000 post-surgical cardiac patients during their hospitalization. Our framework is evaluated and compared to multiple baseline approaches in terms of the mean absolute prediction error and the absolute percentage error. Results We tested our framework by first learning the time series model from data for the patient in the training set, and then applying the model in order to predict future time series values on the patients in the test set. We show that our model outperforms multiple existing models in terms of its predictive accuracy. Our method achieved a 3.13% average prediction accuracy improvement on ten CBC lab time series when it was compared against the best performing baseline. A 5.25% average accuracy improvement was observed when only short-term predictions were considered. Conclusion A new hierarchical dynamical system framework that lets us model irregularly sampled time series data is a promising new direction for modeling clinical time series and for improving their predictive

  20. Performing T-tests to Compare Autocorrelated Time Series Data Collected from Direct-Reading Instruments.

    Science.gov (United States)

    O'Shaughnessy, Patrick; Cavanaugh, Joseph E

    2015-01-01

    Industrial hygienists now commonly use direct-reading instruments to evaluate hazards in the workplace. The stored values over time from these instruments constitute a time series of measurements that are often autocorrelated. Given the need to statistically compare two occupational scenarios using values from a direct-reading instrument, a t-test must consider measurement autocorrelation or the resulting test will have a largely inflated type-1 error probability (false rejection of the null hypothesis). A method is described for both the one-sample and two-sample cases which properly adjusts for autocorrelation. This method involves the computation of an "equivalent sample size" that effectively decreases the actual sample size when determining the standard error of the mean for the time series. An example is provided for the one-sample case, and an example is given where a two-sample t-test is conducted for two autocorrelated time series comprised of lognormally distributed measurements.
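
    The following sketch illustrates the "equivalent sample size" idea described above using a common AR(1)-based adjustment, n_eff = n(1 - r1)/(1 + r1); the authors' exact adjustment may differ, so this is an assumption-laden illustration rather than their method.

```python
import numpy as np
from scipy import stats

def equivalent_sample_size(x):
    """Effective sample size under an AR(1) approximation of the autocorrelation."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    x0 = x - x.mean()
    r1 = np.sum(x0[:-1] * x0[1:]) / np.sum(x0 ** 2)   # lag-1 autocorrelation
    r1 = max(min(r1, 0.99), 0.0)                      # guard against extreme estimates
    return n * (1.0 - r1) / (1.0 + r1)

def one_sample_t_autocorr(x, mu0):
    """One-sample t-test against mu0 using the equivalent sample size."""
    x = np.asarray(x, dtype=float)
    n_eff = equivalent_sample_size(x)
    se = x.std(ddof=1) / np.sqrt(n_eff)
    t = (x.mean() - mu0) / se
    p = 2.0 * stats.t.sf(abs(t), df=n_eff - 1)
    return t, p, n_eff

# Toy usage: an autocorrelated series of direct-reading exposure values.
rng = np.random.default_rng(7)
x = np.zeros(500)
for i in range(1, 500):
    x[i] = 0.7 * x[i - 1] + rng.normal()
t_stat, p, n_eff = one_sample_t_autocorr(x + 1.0, mu0=0.8)
print(f"t = {t_stat:.2f}, p = {p:.4f}, equivalent n = {n_eff:.1f} (nominal n = 500)")
```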

  1. Analysis of local ionospheric time varying characteristics with singular value decomposition

    DEFF Research Database (Denmark)

    Jakobsen, Jakob Anders; Knudsen, Per; Jensen, Anna B. O.

    2010-01-01

    In this paper, a time series from 1999 to 2007 of absolute total electron content (TEC) values has been computed and analyzed using singular value decomposition (SVD). The data set has been computed using a Kalman Filter and is based on dual frequency GPS data from three reference stations in Den...

  2. Quality Control Procedure Based on Partitioning of NMR Time Series

    Directory of Open Access Journals (Sweden)

    Michał Staniszewski

    2018-03-01

    Full Text Available The quality of the magnetic resonance spectroscopy (MRS depends on the stability of magnetic resonance (MR system performance and optimal hardware functioning, which ensure adequate levels of signal-to-noise ratios (SNR as well as good spectral resolution and minimal artifacts in the spectral data. MRS quality control (QC protocols and methodologies are based on phantom measurements that are repeated regularly. In this work, a signal partitioning algorithm based on a dynamic programming (DP method for QC assessment of the spectral data is described. The proposed algorithm allows detection of the change points—the abrupt variations in the time series data. The proposed QC method was tested using the simulated and real phantom data. Simulated data were randomly generated time series distorted by white noise. The real data were taken from the phantom quality control studies of the MRS scanner collected for four and a half years and analyzed by LCModel software. Along with the proposed algorithm, performance of various literature methods was evaluated for the predefined number of change points based on the error values calculated by subtracting the mean values calculated for the periods between the change-points from the original data points. The time series were checked using external software, a set of external methods and the proposed tool, and the obtained results were comparable. The application of dynamic programming in the analysis of the phantom MRS data is a novel approach to QC. The obtained results confirm that the presented change-point-detection tool can be used either for independent analysis of MRS time series (or any other or as a part of quality control.
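
    A minimal sketch of dynamic-programming segmentation for a predefined number of change points, minimizing the within-segment sum of squared deviations; it illustrates the general DP idea mentioned above, not the specific algorithm or the LCModel-based QC pipeline of the paper.

```python
import numpy as np

def dp_change_points(x, n_changes):
    """Optimal segmentation of x into (n_changes + 1) segments by dynamic programming.

    The cost of a segment is its within-segment sum of squared deviations from the mean.
    Returns the indices at which new segments start (the change points).
    """
    x = np.asarray(x, dtype=float)
    n = len(x)
    csum, csum2 = np.cumsum(x), np.cumsum(x ** 2)

    def seg_cost(i, j):          # cost of segment x[i:j], 0 <= i < j <= n
        s = csum[j - 1] - (csum[i - 1] if i > 0 else 0.0)
        s2 = csum2[j - 1] - (csum2[i - 1] if i > 0 else 0.0)
        return s2 - s * s / (j - i)

    K = n_changes + 1
    cost = np.full((K + 1, n + 1), np.inf)   # cost[k][j]: best cost of x[:j] with k segments
    prev = np.zeros((K + 1, n + 1), dtype=int)
    cost[0, 0] = 0.0
    for k in range(1, K + 1):
        for j in range(k, n + 1):
            for i in range(k - 1, j):
                c = cost[k - 1, i] + seg_cost(i, j)
                if c < cost[k, j]:
                    cost[k, j] = c
                    prev[k, j] = i
    # Backtrack the optimal segment boundaries.
    bounds, j = [], n
    for k in range(K, 0, -1):
        j = prev[k, j]
        bounds.append(j)
    return sorted(bounds)[1:]    # drop the leading 0; the rest are change points

# Toy phantom-like series: two abrupt level shifts plus noise.
rng = np.random.default_rng(3)
x = np.concatenate([rng.normal(0, 1, 100), rng.normal(3, 1, 80), rng.normal(1, 1, 120)])
print(dp_change_points(x, n_changes=2))   # expected near [100, 180]
```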

  3. Evaluation of random errors in Williams’ series coefficients obtained with digital image correlation

    International Nuclear Information System (INIS)

    Lychak, Oleh V; Holyns’kiy, Ivan S

    2016-01-01

    The use of the Williams’ series parameters for fracture analysis requires valid information about their error values. The aim of this investigation is the development of the method for estimation of the standard deviation of random errors of the Williams’ series parameters, obtained from the measured components of the stress field. Also, the criteria for choosing the optimal number of terms in the truncated Williams’ series for derivation of their parameters with minimal errors is proposed. The method was used for the evaluation of the Williams’ parameters, obtained from the data, and measured by the digital image correlation technique for testing a three-point bending specimen. (paper)

  4. Unstable Periodic Orbit Analysis of Histograms of Chaotic Time Series

    International Nuclear Information System (INIS)

    Zoldi, S.M.

    1998-01-01

    Using the Lorenz equations, we have investigated whether unstable periodic orbits (UPOs) associated with a strange attractor may predict the occurrence of the robust sharp peaks in histograms of some experimental chaotic time series. Histograms with sharp peaks occur for the Lorenz parameter value r=60.0 but not for r=28.0 , and the sharp peaks for r=60.0 do not correspond to a histogram derived from any single UPO. However, we show that histograms derived from the time series of a non-Axiom-A chaotic system can be accurately predicted by an escape-time weighting of UPO histograms. copyright 1998 The American Physical Society

  5. New insights into soil temperature time series modeling: linear or nonlinear?

    Science.gov (United States)

    Bonakdari, Hossein; Moeeni, Hamid; Ebtehaj, Isa; Zeynoddin, Mohammad; Mahoammadian, Abdolmajid; Gharabaghi, Bahram

    2018-03-01

    Soil temperature (ST) is an important dynamic parameter, whose prediction is a major research topic in various fields including agriculture because ST has a critical role in hydrological processes at the soil surface. In this study, a new linear methodology is proposed based on stochastic methods for modeling daily soil temperature (DST). With this approach, the ST series components are determined to carry out modeling and spectral analysis. The results of this process are compared with two linear methods based on seasonal standardization and seasonal differencing in terms of four DST series. The series used in this study were measured at two stations, Champaign and Springfield, at depths of 10 and 20 cm. The results indicate that in all ST series reviewed, the periodic term is the most robust among all components. According to a comparison of the three methods applied to analyze the various series components, it appears that spectral analysis combined with stochastic methods outperformed the seasonal standardization and seasonal differencing methods. In addition to comparing the proposed methodology with linear methods, the ST modeling results were compared with the two nonlinear methods in two forms: considering hydrological variables (HV) as input variables and DST modeling as a time series. In a previous study at the mentioned sites, Kim and Singh (Theor Appl Climatol 118:465-479, 2014) applied the popular Multilayer Perceptron (MLP) neural network and Adaptive Neuro-Fuzzy Inference System (ANFIS) nonlinear methods and considered HV as input variables. The comparison results signify that the relative error projected in estimating DST by the proposed methodology was about 6%, while this value with MLP and ANFIS was over 15%. Moreover, MLP and ANFIS models were employed for DST time series modeling. Due to these models' relatively inferior performance to the proposed methodology, two hybrid models were implemented: the weights and membership function of MLP and

  6. Divergent Perturbation Series

    International Nuclear Information System (INIS)

    Suslov, I.M.

    2005-01-01

    Various perturbation series are factorially divergent. The behavior of their high-order terms can be determined by Lipatov's method, which involves the use of instanton configurations of appropriate functional integrals. When the Lipatov asymptotic form is known and several lowest order terms of the perturbation series are found by direct calculation of diagrams, one can gain insight into the behavior of the remaining terms of the series, which can be resummed to solve various strong-coupling problems in a certain approximation. This approach is demonstrated by determining the Gell-Mann-Low functions in φ⁴ theory, QED, and QCD with arbitrary coupling constants. An overview of the mathematical theory of divergent series is presented, and interpretation of perturbation series is discussed. Explicit derivations of the Lipatov asymptotic form are presented for some basic problems in theoretical physics. A solution is proposed to the problem of renormalon contributions, which hampered progress in this field in the late 1970s. Practical perturbation-series summation schemes are described both for a coupling constant of order unity and in the strong-coupling limit. An interpretation of the Borel integral is given for 'non-Borel-summable' series. Higher order corrections to the Lipatov asymptotic form are discussed

  7. Answers to selected problems in multivariable calculus with linear algebra and series

    CERN Document Server

    Trench, William F

    1972-01-01

    Answers to Selected Problems in Multivariable Calculus with Linear Algebra and Series contains the answers to selected problems in linear algebra, the calculus of several variables, and series. Topics covered range from vectors and vector spaces to linear matrices and analytic geometry, as well as differential calculus of real-valued functions. Theorems and definitions are included, most of which are followed by worked-out illustrative examples.The problems and corresponding solutions deal with linear equations and matrices, including determinants; vector spaces and linear transformations; eig

  8. Springer Autopsy of measurements with the ATLAS detector at the LHC

    CERN Document Server

    Beauchemin, Pierre-Hugues

    2017-01-01

    A lot of attention has been devoted to the study of discoveries in high energy physics (HEP), but less on measurements aiming at improving an existing theory like the standard model of particle physics, getting more precise values for the parameters of the theory or establishing relationships between them. This paper provides a detailed and critical study of how measurements are performed in recent HEP experiments, taking examples from differential cross section measurements with the ATLAS detector at the LHC. This study will be used to provide an elucidation of the concept of event used in HEP, in order to determine what constitutes an observation and what does not. It will highlight the essential place taken by theory-ladenness in order to produce observational facts, and will show how uncertainty and sensitivity estimates constitute an operational approach to robustness, inside the practice of science, avoiding potential circularity problem traditionally implied by theory-ladenness. This is in contrast to ...

  9. Algorithm for predicting the evolution of series of dynamics of complex systems in solving information problems

    Science.gov (United States)

    Kasatkina, T. I.; Dushkin, A. V.; Pavlov, V. A.; Shatovkin, R. R.

    2018-03-01

    In the development of information systems and programs to predict series of dynamics, neural network methods have recently been applied. They are more flexible than existing analogues and are capable of taking the nonlinearities of the series into account. In this paper, we propose a modified algorithm for predicting series of dynamics, which includes a method for training neural networks and an approach to describing and presenting input data, based on prediction by the multilayer perceptron method. To construct the neural network, the values of a dynamics series at its extremum points and the corresponding time values, formed by the sliding window method, are used as input data. The proposed algorithm can act as an independent approach to predicting series of dynamics, or as one part of a forecasting system. The efficiency of predicting the evolution of a dynamics series for short-term one-step and long-term multi-step forecasts is compared between the classical multilayer perceptron method and the modified algorithm, using synthetic and real data. The result of the modification is a reduction of the iterative error that arises from feeding previously predicted values back as inputs to the neural network, as well as an increase in the accuracy of the iterative prediction.
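
    A hedged sketch of the generic sliding-window MLP forecasting setup that the abstract builds on (the proposed modification based on extremum points is not reproduced here); it uses scikit-learn's MLPRegressor on a synthetic series, and the window size and network settings are assumptions.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

def make_windows(series, window):
    """Turn a 1-D series into (X, y) pairs in which each window predicts the next value."""
    X = np.array([series[i:i + window] for i in range(len(series) - window)])
    y = np.array(series[window:])
    return X, y

# Synthetic dynamics series used purely for illustration.
rng = np.random.default_rng(0)
t = np.arange(1200)
series = np.sin(0.05 * t) + 0.1 * rng.normal(size=t.size)

window = 20
train_series = series[:1000]
X_train, y_train = make_windows(train_series, window)

model = MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=2000, random_state=0)
model.fit(X_train, y_train)

# Long-term multi-step (iterative) forecast: feed each prediction back as the newest input.
history = list(train_series[-window:])
forecast = []
for _ in range(50):
    nxt = model.predict(np.array(history[-window:]).reshape(1, -1))[0]
    forecast.append(nxt)
    history.append(nxt)
print(np.round(forecast[:5], 3))
```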

  10. On the series

    Indian Academy of Sciences (India)

    2016-08-26

    Aug 26, 2016 ... http://www.ias.ac.in/article/fulltext/pmsc/115/04/0371-0381. Keywords. Inverse binomial series; hypergeometric series; polylogarithms; integral representations. Abstract. In this paper we investigate the series $\sum_{k=1}^{\infty} \binom{3k}{k}^{-1} k^{-n} x^{k}$. Obtaining some integral representations of them, we evaluated the ...

  11. Time series analysis time series analysis methods and applications

    CERN Document Server

    Rao, Tata Subba; Rao, C R

    2012-01-01

    The field of statistics not only affects all areas of scientific activity, but also many other matters such as public policy. It is branching rapidly into so many different subjects that a series of handbooks is the only way of comprehensively presenting the various aspects of statistical methodology, applications, and recent developments. The Handbook of Statistics is a series of self-contained reference books. Each volume is devoted to a particular topic in statistics, with Volume 30 dealing with time series. The series is addressed to the entire community of statisticians and scientists in various disciplines who use statistical methodology in their work. At the same time, special emphasis is placed on applications-oriented techniques, with the applied statistician in mind as the primary audience. Comprehensively presents the various aspects of statistical methodology. Discusses a wide variety of diverse applications and recent developments. Contributors are internationally renowned experts in their respect...

  12. Patient-perceived value of Medication Therapy Management (MTM services: a series of focus groups

    Directory of Open Access Journals (Sweden)

    Heidi Schultz

    2012-01-01

    Full Text Available Objective: To determine the patient-perceived value of MTM services and non-financial barriers preventing patients with insurance coverage from receiving MTM services. Design: Focus groups. Setting: Fairview Pharmacy Services, Minneapolis, MN. Participants: Three focus groups, each with five to nine participants, consisting of different participant populations: (i patients who paid out-of-pocket to receive MTM services; (ii insurance beneficiaries, under which MTM is a covered benefit and participants may have received incentives for receiving MTM services; (iii patients with an insurance plan which covers MTM services who were recruited to receive MTM services but declined. Intervention: MTM services. Main Outcome Measure: Patient-perceived value of MTM services and non-financial barriers. Results: Seven themes were identified relating to the patient-perceived value of MTM services: collaboration of the health care team, MTM pharmacist as a supporter/advocate/confidant, MTM pharmacist as a resource for questions and education, accessibility to the MTM pharmacist, financial incentives for participation in MTM services, MTM pharmacy as a specialty field, and the MTM pharmacist as a coordinator. Three themes were identified regarding patient-perceived non-financial barriers to receiving MTM services, including: availability of the MTM pharmacist, patient/physician lack of knowledge of MTM services, patient's belief that MTM services are not needed. Conclusion: MTM is a service which patients identify as valuable. Patients are able to identify non-financial barriers that may prevent some patients from receiving MTM services. This study provides preliminary evidence of both the value and barriers perceived by patients.   Type: Original Research

  13. Patient-perceived value of Medication Therapy Management (MTM services: a series of focus groups

    Directory of Open Access Journals (Sweden)

    Amanda Brummel, PharmD

    2012-01-01

    Full Text Available Objective: To determine the patient-perceived value of MTM services and non-financial barriers preventing patients with insurance coverage from receiving MTM services. Design: Focus groups. Setting: Fairview Pharmacy Services, Minneapolis, MN.Participants: Three focus groups, each with five to nine participants, consisting of different participant populations: (i patients who paid out-of-pocket to receive MTM services; (ii insurance beneficiaries, under which MTM is a covered benefit and participants may have received incentives for receiving MTM services; (iii patients with an insurance plan which covers MTM services who were recruited to receive MTM services but declined. Intervention: MTM services. Main Outcome Measure: Patient-perceived value of MTM services and non-financial barriers. Results: Seven themes were identified relating to the patient-perceived value of MTM services: collaboration of the health care team, MTM pharmacist as a supporter/advocate/confidant, MTM pharmacist as a resource for questions and education, accessibility to the MTM pharmacist, financial incentives for participation in MTM services, MTM pharmacy as a specialty field, and the MTM pharmacist as a coordinator. Three themes were identified regarding patient-perceived non-financial barriers to receiving MTM services, including: availability of the MTM pharmacist, patient/physician lack of knowledge of MTM services, patient’s belief that MTM services are not needed. Conclusion: MTM is a service which patients identify as valuable. Patients are able to identify non-financial barriers that may prevent some patients from receiving MTM services. This study provides preliminary evidence of both the value and barriers perceived by patients.

  14. Cross-sample entropy of foreign exchange time series

    Science.gov (United States)

    Liu, Li-Zhi; Qian, Xi-Yuan; Lu, Heng-Yao

    2010-11-01

    The correlation of foreign exchange rates in currency markets is investigated based on the empirical data of DKK/USD, NOK/USD, CAD/USD, JPY/USD, KRW/USD, SGD/USD, THB/USD and TWD/USD for a period from 1995 to 2002. Cross-SampEn (cross-sample entropy) method is used to compare the returns of every two exchange rate time series to assess their degree of asynchrony. The calculation method of confidence interval of SampEn is extended and applied to cross-SampEn. The cross-SampEn and its confidence interval for every two of the exchange rate time series in periods 1995-1998 (before the Asian currency crisis) and 1999-2002 (after the Asian currency crisis) are calculated. The results show that the cross-SampEn of every two of these exchange rates becomes higher after the Asian currency crisis, indicating a higher asynchrony between the exchange rates. Especially for Singapore, Thailand and Taiwan, the cross-SampEn values after the Asian currency crisis are significantly higher than those before the Asian currency crisis. Comparison with the correlation coefficient shows that cross-SampEn is superior to describe the correlation between time series.
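
    A simplified numpy implementation of cross-sample entropy, roughly following the standard definition (negative log of the ratio of template matches of length m+1 to matches of length m, with a Chebyshev tolerance r); the standardization, the choices m = 2 and r = 0.2, and the toy series are assumptions, not the exact procedure of the paper.

```python
import numpy as np

def cross_sampen(u, v, m=2, r=0.2):
    """Cross-sample entropy between two equally long series (Chebyshev distance).

    Both series are standardized, so the tolerance r is in units of the standard
    deviation, following common practice.
    """
    u = np.asarray(u, dtype=float)
    v = np.asarray(v, dtype=float)
    u = (u - u.mean()) / u.std()
    v = (v - v.mean()) / v.std()

    def count_matches(length):
        n = len(u) - m                      # same number of templates for m and m + 1
        Ut = np.array([u[i:i + length] for i in range(n)])
        Vt = np.array([v[j:j + length] for j in range(n)])
        # Chebyshev distance between every u-template and every v-template.
        d = np.max(np.abs(Ut[:, None, :] - Vt[None, :, :]), axis=2)
        return np.sum(d <= r)

    B = count_matches(m)                    # matches of length m
    A = count_matches(m + 1)                # matches of length m + 1
    return -np.log(A / B)

# Toy usage with two correlated "return" series.
rng = np.random.default_rng(5)
common = rng.normal(size=500)
x = common + 0.1 * rng.normal(size=500)
y = 0.8 * common + 0.2 * rng.normal(size=500)
print(f"cross-SampEn = {cross_sampen(x, y):.3f}")   # lower values indicate more synchrony
```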

  15. Clustering Multivariate Time Series Using Hidden Markov Models

    Directory of Open Access Journals (Sweden)

    Shima Ghassempour

    2014-03-01

    Full Text Available In this paper we describe an algorithm for clustering multivariate time series with variables taking both categorical and continuous values. Time series of this type are frequent in health care, where they represent the health trajectories of individuals. The problem is challenging because categorical variables make it difficult to define a meaningful distance between trajectories. We propose an approach based on Hidden Markov Models (HMMs, where we first map each trajectory into an HMM, then define a suitable distance between HMMs and finally proceed to cluster the HMMs with a method based on a distance matrix. We test our approach on a simulated, but realistic, data set of 1,255 trajectories of individuals of age 45 and over, on a synthetic validation set with known clustering structure, and on a smaller set of 268 trajectories extracted from the longitudinal Health and Retirement Survey. The proposed method can be implemented quite simply using standard packages in R and Matlab and may be a good candidate for solving the difficult problem of clustering multivariate time series with categorical variables using tools that do not require advanced statistic knowledge, and therefore are accessible to a wide range of researchers.
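
    A hedged sketch of the general pipeline described above: fit one HMM per trajectory, derive a symmetrized log-likelihood distance between every pair of models, and cluster the resulting distance matrix. The paper handles mixed categorical-continuous data in R and Matlab; this continuous-only sketch assumes the Python package hmmlearn is available and uses illustrative settings throughout.

```python
import numpy as np
from hmmlearn.hmm import GaussianHMM
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

def fit_hmms(trajectories, n_states=3):
    """Fit one Gaussian HMM per (continuous) trajectory."""
    models = []
    for traj in trajectories:
        hmm = GaussianHMM(n_components=n_states, covariance_type="diag", n_iter=50)
        hmm.fit(traj)                        # traj has shape (T, n_features)
        models.append(hmm)
    return models

def hmm_distance_matrix(models, trajectories):
    """Symmetrized per-sample log-likelihood distance between every pair of fitted HMMs."""
    n = len(models)
    D = np.zeros((n, n))
    for i in range(n):
        for j in range(i + 1, n):
            # How much worse the other model explains each trajectory, per time step.
            d_ij = (models[i].score(trajectories[i]) - models[j].score(trajectories[i])) / len(trajectories[i])
            d_ji = (models[j].score(trajectories[j]) - models[i].score(trajectories[j])) / len(trajectories[j])
            D[i, j] = D[j, i] = 0.5 * (d_ij + d_ji)
    return D

# Toy usage: five short synthetic trajectories forming two groups.
rng = np.random.default_rng(11)
trajectories = [rng.normal(loc=mu, size=(60, 2)) for mu in (0.0, 0.1, 3.0, 3.2, 3.1)]
models = fit_hmms(trajectories, n_states=2)
D = hmm_distance_matrix(models, trajectories)
labels = fcluster(linkage(squareform(D, checks=False), method="average"),
                  t=2, criterion="maxclust")
print(labels)   # trajectories with similar dynamics should share a label
```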

  16. Modeling pollen time series using seasonal-trend decomposition procedure based on LOESS smoothing.

    Science.gov (United States)

    Rojo, Jesús; Rivero, Rosario; Romero-Morte, Jorge; Fernández-González, Federico; Pérez-Badia, Rosa

    2017-02-01

    Analysis of airborne pollen concentrations provides valuable information on plant phenology and is thus a useful tool in agriculture-for predicting harvests in crops such as the olive and for deciding when to apply phytosanitary treatments-as well as in medicine and the environmental sciences. Variations in airborne pollen concentrations, moreover, are indicators of changing plant life cycles. By modeling pollen time series, we can not only identify the variables influencing pollen levels but also predict future pollen concentrations. In this study, airborne pollen time series were modeled using a seasonal-trend decomposition procedure based on LOcally wEighted Scatterplot Smoothing (LOESS) smoothing (STL). The data series-daily Poaceae pollen concentrations over the period 2006-2014-was broken up into seasonal and residual (stochastic) components. The seasonal component was compared with data on Poaceae flowering phenology obtained by field sampling. Residuals were fitted to a model generated from daily temperature and rainfall values, and daily pollen concentrations, using partial least squares regression (PLSR). This method was then applied to predict daily pollen concentrations for 2014 (independent validation data) using results for the seasonal component of the time series and estimates of the residual component for the period 2006-2013. Correlation between predicted and observed values was r = 0.79 (correlation coefficient) for the pre-peak period (i.e., the period prior to the peak pollen concentration) and r = 0.63 for the post-peak period. Separate analysis of each of the components of the pollen data series enables the sources of variability to be identified more accurately than by analysis of the original non-decomposed data series, and for this reason, this procedure has proved to be a suitable technique for analyzing the main environmental factors influencing airborne pollen concentrations.
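
    A hedged two-stage sketch mirroring the approach summarized above: STL decomposition of a daily series followed by PLS regression of the residual component on meteorological predictors. Synthetic pollen and weather series stand in for the real data, the 365-day period and the number of PLS components are assumptions, and, unlike the paper, the seasonal and trend components are here taken from an STL fit on the full series, which is a simplification.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.seasonal import STL
from sklearn.cross_decomposition import PLSRegression

# Synthetic stand-ins for daily Poaceae pollen counts and weather covariates.
rng = np.random.default_rng(2)
n_days = 6 * 365
t = np.arange(n_days)
temperature = 15 + 10 * np.sin(2 * np.pi * t / 365) + rng.normal(0, 2, n_days)
rainfall = rng.gamma(1.0, 2.0, n_days)
pollen = np.clip(50 * np.sin(2 * np.pi * (t - 100) / 365) + 2 * temperature
                 - 3 * rainfall + rng.normal(0, 10, n_days), 0, None)

# Step 1: seasonal-trend decomposition based on LOESS (STL) with a yearly period.
series = pd.Series(pollen, index=pd.date_range("2006-01-01", periods=n_days, freq="D"))
stl_fit = STL(series, period=365).fit()
residual = stl_fit.resid.values

# Step 2: PLS regression of the residual (stochastic) component on weather variables.
X = np.column_stack([temperature, rainfall])
split = 5 * 365
pls = PLSRegression(n_components=2)
pls.fit(X[:split], residual[:split])               # train on the first five years

# Reconstruct the final year: STL seasonal + trend plus the PLS estimate of the residual.
pred_resid = pls.predict(X[split:]).ravel()
prediction = stl_fit.seasonal.values[split:] + stl_fit.trend.values[split:] + pred_resid
print(f"r = {np.corrcoef(prediction, pollen[split:])[0, 1]:.2f}")
```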

  17. Sizing of the Series Dynamic Breaking Resistor in a Doubly Fed Induction Generator Wind Turbine

    DEFF Research Database (Denmark)

    Soliman, Hammam; Wang, Huai; Zhou, Dao

    2014-01-01

    This paper investigates the effect of Series Dynamic Breaking Resistor (SDBR) sizing on a Doubly Fed Induction Generator (DFIG) based wind power conversion system. The boundary of the SDBR value is firstly derived by taking into account the controllability of the rotor side converter and the maxi......This paper investigates the effect of Series Dynamic Breaking Resistor (SDBR) sizing on a Doubly Fed Induction Generator (DFIG) based wind power conversion system. The boundary of the SDBR value is firstly derived by taking into account the controllability of the rotor side converter...... and the maximum allowable voltage of the stator. Then the impact of the SDBR value on the rotor current, stator voltage, DC-link voltage, reactive power capability and introduced power loss during voltage sag operation is evaluated by simulation. The presented study enables a trade-off sizing of the SDBR among...

  18. On-line analysis of reactor noise using time-series analysis

    International Nuclear Information System (INIS)

    McGevna, V.G.

    1981-10-01

    A method to allow use of time series analysis for on-line noise analysis has been developed. On-line analysis of noise in nuclear power reactors has been limited primarily to spectral analysis and related frequency domain techniques. Time series analysis has many distinct advantages over spectral analysis in the automated processing of reactor noise. However, fitting an autoregressive-moving average (ARMA) model to time series data involves non-linear least squares estimation. Unless a high speed, general purpose computer is available, the calculations become too time consuming for on-line applications. To eliminate this problem, a special purpose algorithm was developed for fitting ARMA models. While it is based on a combination of steepest descent and Taylor series linearization, properties of the ARMA model are used so that the auto- and cross-correlation functions can be used to eliminate the need for estimating derivatives. The number of calculations per iteration varies linearly ...

  19. Long Range Dependence Prognostics for Bearing Vibration Intensity Chaotic Time Series

    Directory of Open Access Journals (Sweden)

    Qing Li

    2016-01-01

    Full Text Available According to the chaotic features and typical fractional-order characteristics of bearing vibration intensity time series, a forecasting approach based on long range dependence (LRD) is proposed. In order to reveal the internal chaotic properties, the vibration intensity time series are reconstructed in phase space based on chaos theory: the delay time is computed with the C-C method, and the optimal embedding dimension and saturated correlation dimension are calculated via the Grassberger–Procaccia (G-P) method, respectively, so that the chaotic characteristics of the vibration intensity time series can be jointly determined by the largest Lyapunov exponent and the phase-plane trajectory; the largest Lyapunov exponent is calculated by the Wolf method and the phase-plane trajectory is illustrated using the Duffing-Holmes Oscillator (DHO). The Hurst exponent and a long range dependence prediction method are proposed to verify the typical fractional-order features and improve the prediction accuracy of bearing vibration intensity time series, respectively. Experiments show that the vibration intensity time series have chaotic properties and that the LRD prediction method outperforms the other prediction methods (largest Lyapunov exponent, auto-regressive moving average (ARMA) and BP neural network (BPNN) models) in prediction accuracy and prediction performance, which provides a new approach for running-tendency prediction for rotating machinery and offers some guidance for engineering practice.
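
    Since the abstract relies on the Hurst exponent to verify long range dependence, the following sketch shows one common way to estimate it, rescaled-range (R/S) analysis; the window-size grid and the toy series are assumptions, and the paper may use a different estimator.

```python
import numpy as np

def hurst_rs(x, min_window=8):
    """Estimate the Hurst exponent of a series by rescaled-range (R/S) analysis."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    window_sizes = np.unique(np.logspace(np.log10(min_window), np.log10(n // 4), 15).astype(int))
    log_w, log_rs = [], []
    for w in window_sizes:
        rs_vals = []
        for start in range(0, n - w + 1, w):          # non-overlapping windows
            seg = x[start:start + w]
            dev = np.cumsum(seg - seg.mean())
            r = dev.max() - dev.min()                  # range of cumulative deviations
            s = seg.std()
            if s > 0:
                rs_vals.append(r / s)
        if rs_vals:
            log_w.append(np.log(w))
            log_rs.append(np.log(np.mean(rs_vals)))
    # The Hurst exponent is the slope of log(R/S) versus log(window size).
    return np.polyfit(log_w, log_rs, 1)[0]

# Toy usage: H is about 0.5 for white noise and close to 1 for an integrated (random-walk) series.
rng = np.random.default_rng(9)
white = rng.normal(size=4096)
walk = np.cumsum(white)
print(f"H(white noise) ≈ {hurst_rs(white):.2f}, H(random walk) ≈ {hurst_rs(walk):.2f}")
```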

  20. Connected to TV series: Quantifying series watching engagement.

    Science.gov (United States)

    Tóth-Király, István; Bőthe, Beáta; Tóth-Fáber, Eszter; Hága, Győző; Orosz, Gábor

    2017-12-01

    Background and aims Television series watching stepped into a new golden age with the appearance of online series. Being highly involved in series could potentially lead to negative outcomes, but highly engaged and problematic viewers should be distinguished. As no appropriate measure is available for identifying such differences, a short and valid measure was constructed in a multistudy investigation: the Series Watching Engagement Scale (SWES). Methods In Study 1 (N_Sample1 = 740 and N_Sample2 = 740), exploratory structural equation modeling and confirmatory factor analysis were used to identify the most important facets of series watching engagement. In Study 2 (N = 944), measurement invariance of the SWES was investigated between males and females. In Study 3 (N = 1,520), latent profile analysis (LPA) was conducted to identify subgroups of viewers. Results Five factors of engagement were identified in Study 1 that are of major relevance: persistence, identification, social interaction, overuse, and self-development. Study 2 supported the high levels of equivalence between males and females. In Study 3, three groups of viewers (low-, medium-, and high-engagement viewers) were identified. The highly engaged at-risk group can be differentiated from the other two along key variables of watching time and personality. Discussion The present findings support the overall validity, reliability, and usefulness of the SWES and the results of the LPA showed that it might be useful to identify at-risk viewers before the development of problematic use.

  1. Happiness in the Context of European Human Values

    Directory of Open Access Journals (Sweden)

    Paun Maria

    2016-07-01

    Full Text Available When we think about our values, we think about what is important in our lives (security, independence, wisdom, success, goodness, pleasure). This article has as its starting point the assumption that happiness is the result of the manifestation of human values. The first part deals with the concept of human values, and the second part is focused on the analysis of secondary sources (data obtained from the investigation conducted by the European Social Survey). The data were retrieved and processed by me in Excel and SPSS. In order to test the research hypotheses, correlation analysis was used. To support the argument we used a series of tables and representative images.

  2. A simple and fast representation space for classifying complex time series

    International Nuclear Information System (INIS)

    Zunino, Luciano; Olivares, Felipe; Bariviera, Aurelio F.; Rosso, Osvaldo A.

    2017-01-01

    In the context of time series analysis considerable effort has been directed towards the implementation of efficient discriminating statistical quantifiers. Very recently, a simple and fast representation space has been introduced, namely the number of turning points versus the Abbe value. It is able to separate time series from stationary and non-stationary processes with long-range dependences. In this work we show that this bidimensional approach is useful for distinguishing complex time series: different sets of financial and physiological data are efficiently discriminated. Additionally, a multiscale generalization that takes into account the multiple time scales often involved in complex systems has been also proposed. This multiscale analysis is essential to reach a higher discriminative power between physiological time series in health and disease. - Highlights: • A bidimensional scheme has been tested for classification purposes. • A multiscale generalization is introduced. • Several practical applications confirm its usefulness. • Different sets of financial and physiological data are efficiently distinguished. • This multiscale bidimensional approach has high potential as discriminative tool.
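
    For readers who want to reproduce the representation space described above, the sketch below computes the two coordinates for a single series: the number of turning points and the Abbe value. The Abbe-value normalization shown (mean squared successive difference over twice the variance) is one common convention and may differ in detail from the paper's.

        import numpy as np

        def turning_points(x):
            """Count interior points that are strict local maxima or minima."""
            x = np.asarray(x, dtype=float)
            left, mid, right = x[:-2], x[1:-1], x[2:]
            return int(np.sum(((mid > left) & (mid > right)) | ((mid < left) & (mid < right))))

        def abbe_value(x):
            """Abbe value: mean squared successive difference over twice the variance
            (one common normalization; conventions vary)."""
            x = np.asarray(x, dtype=float)
            return np.mean(np.diff(x) ** 2) / (2.0 * np.var(x))

        # Each series is mapped to a point in the (turning points, Abbe value) plane.
        # x = np.random.randn(1000); print(turning_points(x), abbe_value(x))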

  3. A simple and fast representation space for classifying complex time series

    Energy Technology Data Exchange (ETDEWEB)

    Zunino, Luciano, E-mail: lucianoz@ciop.unlp.edu.ar [Centro de Investigaciones Ópticas (CONICET La Plata – CIC), C.C. 3, 1897 Gonnet (Argentina); Departamento de Ciencias Básicas, Facultad de Ingeniería, Universidad Nacional de La Plata (UNLP), 1900 La Plata (Argentina); Olivares, Felipe, E-mail: olivaresfe@gmail.com [Instituto de Física, Pontificia Universidad Católica de Valparaíso (PUCV), 23-40025 Valparaíso (Chile); Bariviera, Aurelio F., E-mail: aurelio.fernandez@urv.cat [Department of Business, Universitat Rovira i Virgili, Av. Universitat 1, 43204 Reus (Spain); Rosso, Osvaldo A., E-mail: oarosso@gmail.com [Instituto de Física, Universidade Federal de Alagoas (UFAL), BR 104 Norte km 97, 57072-970, Maceió, Alagoas (Brazil); Instituto Tecnológico de Buenos Aires (ITBA) and CONICET, C1106ACD, Av. Eduardo Madero 399, Ciudad Autónoma de Buenos Aires (Argentina); Complex Systems Group, Facultad de Ingeniería y Ciencias Aplicadas, Universidad de los Andes, Av. Mons. Álvaro del Portillo 12.455, Las Condes, Santiago (Chile)

    2017-03-18

    In the context of time series analysis considerable effort has been directed towards the implementation of efficient discriminating statistical quantifiers. Very recently, a simple and fast representation space has been introduced, namely the number of turning points versus the Abbe value. It is able to separate time series from stationary and non-stationary processes with long-range dependences. In this work we show that this bidimensional approach is useful for distinguishing complex time series: different sets of financial and physiological data are efficiently discriminated. Additionally, a multiscale generalization that takes into account the multiple time scales often involved in complex systems has been also proposed. This multiscale analysis is essential to reach a higher discriminative power between physiological time series in health and disease. - Highlights: • A bidimensional scheme has been tested for classification purposes. • A multiscale generalization is introduced. • Several practical applications confirm its usefulness. • Different sets of financial and physiological data are efficiently distinguished. • This multiscale bidimensional approach has high potential as discriminative tool.

  4. Correlation measure to detect time series distances, whence economy globalization

    Science.gov (United States)

    Miśkiewicz, Janusz; Ausloos, Marcel

    2008-11-01

    An instantaneous time series distance is defined through the equal time correlation coefficient. The idea is applied to the Gross Domestic Product (GDP) yearly increments of 21 rich countries between 1950 and 2005 in order to test the process of economic globalisation. Some data discussion is first presented to decide what (EKS, GK, or derived) GDP series should be studied. Distances are then calculated from the correlation coefficient values between pairs of series. The role of time averaging of the distances over finite size windows is discussed. Three network structures are next constructed based on the hierarchy of distances. It is shown that the mean distance between the most developed countries on several networks actually decreases in time, which we consider a proof of globalisation. An empirical law is found for the evolution after 1990, similar to that found in flux creep. The optimal observation time window size is found ≃15 years.
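
    A small illustration of the idea, assuming the widely used correlation-based distance d = sqrt(2(1 - r)); the window length and series names are placeholders, and the paper's exact averaging scheme may differ.

        import numpy as np

        def correlation_distance(a, b):
            """Distance between two series derived from their equal-time correlation coefficient."""
            r = np.corrcoef(a, b)[0, 1]
            return np.sqrt(2.0 * (1.0 - r))   # 0 for perfectly correlated series, 2 for anti-correlated

        def windowed_distances(a, b, window):
            """Distance computed over sliding windows of fixed length."""
            return np.array([correlation_distance(a[i:i + window], b[i:i + window])
                             for i in range(len(a) - window + 1)])

        # Hypothetical yearly GDP-increment series for two countries:
        # d = windowed_distances(gdp_increments_country1, gdp_increments_country2, window=15)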

  5. Research tools: ethylene preparation. In: Chi-Kuang Wen editor. Ethylene in plants. Springer Netherlands. Springer Link

    Science.gov (United States)

    Ethylene is a plant hormone that regulates many aspects of plant growth and development, germination, fruit ripening, senescence, sex determination, abscission, defense, gravitropism, epinasty, and more. For experimental purposes, one needs to treat plant material with ethylene and its inhibitors t...

  6. Value for money assessment for public-private partnerships : a primer.

    Science.gov (United States)

    2015-01-01

    This primer addresses Value for Money Assessment for public-private partnerships (P3s). Companion primers on Financial Assessment and Risk Assessment for P3s are also available as part of this series of primers.

  7. Inflation: Causes and Cures. Series on Public Issues No. 9.

    Science.gov (United States)

    Saving, Thomas R.

    This booklet, one of a series intended to apply economic principles to major social and political issues of the day, focuses on the relationship between growth of the money supply, growth of productivity, and inflation. Provided first is a definition of inflation along with discussions of price indexes, the value of money, and the concept of…

  8. Theories of Value and Problems of Education. Readings in Philosophy of Education Series.

    Science.gov (United States)

    Smith, Philip G., Ed.

    This volume, intended for advanced and specialized teacher education courses, approaches educational problems from one of the standard divisions of general philosophy. Noting that substantive or normative ethics, rather than meta-ethics or analytical ethics, has dominated the literature of values and education, the editor states that his intention…

  9. On the Nodal Lines of Eisenstein Series on Schottky Surfaces

    Science.gov (United States)

    Jakobson, Dmitry; Naud, Frédéric

    2017-04-01

    On convex co-compact hyperbolic surfaces $X=\Gamma\backslash\mathbb{H}^2$, we investigate the behavior of nodal curves of real-valued Eisenstein series $F_\lambda(z,\xi)$, where $\lambda$ is the spectral parameter and $\xi$ the direction at infinity. Eisenstein series are (non-$L^2$) eigenfunctions of the Laplacian $\Delta_X$ satisfying $\Delta_X F_\lambda=(1/4+\lambda^2)F_\lambda$. As $\lambda$ goes to infinity (the high energy limit), we show that, for generic $\xi$, the number of intersections of nodal lines with any compact segment of geodesic grows like $\lambda$, up to multiplicative constants. Applications to the number of nodal domains inside the convex core of the surface are then derived.

  10. Statistical significance approximation in local trend analysis of high-throughput time-series data using the theory of Markov chains.

    Science.gov (United States)

    Xia, Li C; Ai, Dongmei; Cram, Jacob A; Liang, Xiaoyi; Fuhrman, Jed A; Sun, Fengzhu

    2015-09-21

    Local trend (i.e. shape) analysis of time series data reveals co-changing patterns in dynamics of biological systems. However, slow permutation procedures to evaluate the statistical significance of local trend scores have limited its applications to high-throughput time series data analysis, e.g., data from the next generation sequencing technology based studies. By extending the theories for the tail probability of the range of sum of Markovian random variables, we propose formulae for approximating the statistical significance of local trend scores. Using simulations and real data, we show that the approximate p-value is close to that obtained using a large number of permutations (starting at time points >20 with no delay and >30 with delay of at most three time steps) in that the non-zero decimals of the p-values obtained by the approximation and the permutations are mostly the same when the approximate p-value is less than 0.05. In addition, the approximate p-value is slightly larger than that based on permutations, making hypothesis testing based on the approximate p-value conservative. The approximation enables efficient calculation of p-values for pairwise local trend analysis, making large scale all-versus-all comparisons possible. We also propose a hybrid approach by integrating the approximation and permutations to obtain accurate p-values for significantly associated pairs. We further demonstrate its use with the analysis of the Plymouth Marine Laboratory (PML) microbial community time series from high-throughput sequencing data and found interesting organism co-occurrence dynamic patterns. The software tool is integrated into the eLSA software package that now provides accelerated local trend and similarity analysis pipelines for time series data. The package is freely available from the eLSA website: http://bitbucket.org/charade/elsa.

  11. Feasibility of estimating generalized extreme-value distribution of floods

    International Nuclear Information System (INIS)

    Ferreira de Queiroz, Manoel Moises

    2004-01-01

    Flood frequency analysis by generalized extreme-value probability distribution (GEV) has found increased application in recent years, given its flexibility in dealing with the three asymptotic forms of extreme distribution derived from different initial probability distributions. Estimation of higher quantiles of floods is usually accomplished by extrapolating one of the three inverse forms of GEV distribution fitted to the experimental data for return periods much higher than those actually observed. This paper studies the feasibility of fitting GEV distribution by moments of linear combinations of higher order statistics (LH moments) using synthetic annual flood series with varying characteristics and lengths. As the hydrologic events in nature such as daily discharge occur with finite values, their annual maximums are expected to follow the asymptotic form of the limited GEV distribution. Synthetic annual flood series were thus obtained from the stochastic sequences of 365 daily discharges generated by Monte Carlo simulation on the basis of limited probability distribution underlying the limited GEV distribution. The results show that parameter estimation by LH moments of this distribution, fitted to annual flood samples of less than 100-year length derived from initial limited distribution, may indicate any form of extreme-value distribution, not just the limited form as expected, and with large uncertainty in fitted parameters. A frequency analysis, on the basis of GEV distribution and LH moments, of annual flood series of lengths varying between 13 and 73 years observed at 88 gauge stations on Parana River in Brazil, indicated all the three forms of GEV distribution.(Author)
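
    As a rough illustration of GEV-based flood frequency analysis, the sketch below fits a GEV distribution to a hypothetical annual-maximum series with scipy and reads off a 100-year quantile; note that scipy fits by maximum likelihood, whereas the paper uses LH moments, so this is not the authors' estimator.

        import numpy as np
        from scipy import stats

        # Hypothetical annual-maximum flood series (one value per year)
        annual_maxima = np.array([1230., 980., 1510., 1100., 1745., 1320., 905., 1600.,
                                  1255., 1410., 990., 1820., 1150., 1375., 1290.])

        # Fit the GEV distribution by maximum likelihood (scipy's genextreme parameterization)
        shape, loc, scale = stats.genextreme.fit(annual_maxima)

        # Flood magnitude for a 100-year return period (the 0.99 non-exceedance quantile)
        q100 = stats.genextreme.ppf(1.0 - 1.0 / 100.0, shape, loc=loc, scale=scale)
        print(shape, loc, scale, q100)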

  12. Comparison of time-series registration methods in breast dynamic infrared imaging

    Science.gov (United States)

    Riyahi-Alam, S.; Agostini, V.; Molinari, F.; Knaflitz, M.

    2015-03-01

    Automated motion reduction in dynamic infrared imaging is in demand in clinical applications, since movement disarranges the time-temperature series of each pixel, thus originating thermal artifacts that might bias the clinical decision. All previously proposed registration methods are feature-based algorithms requiring manual intervention. The aim of this work is to optimize the registration strategy specifically for Breast Dynamic Infrared Imaging and to make it user-independent. We implemented and evaluated 3 different 3D time-series registration methods: 1. linear affine, 2. non-linear Bspline, 3. Demons, applied to 12 datasets of healthy breast thermal images. The results are evaluated through normalized mutual information, with average values of 0.70 ±0.03, 0.74 ±0.03 and 0.81 ±0.09 (out of 1) for affine, Bspline and Demons registration, respectively, as well as through breast boundary overlap and the Jacobian determinant of the deformation field. The statistical analysis of the results showed that the symmetric diffeomorphic Demons registration method outperforms the others, also yielding the best breast alignment and non-negative Jacobian values, which guarantee image similarity and anatomical consistency of the transformation, due to homologous forces enforcing the pixel geometric disparities to be shortened on all the frames. We propose Demons registration as an effective technique for time-series dynamic infrared registration, to stabilize the local temperature oscillation.
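
    The evaluation metric used above can be reproduced with a short histogram-based routine; several normalizations of mutual information are in use, and the variant below, 2·I(X;Y)/(H(X)+H(Y)), which lies in [0, 1], is only an assumed match for the values reported "out of 1".

        import numpy as np

        def normalized_mutual_information(img1, img2, bins=64):
            """Normalized mutual information between two images: 2*I(X;Y) / (H(X) + H(Y))."""
            joint, _, _ = np.histogram2d(img1.ravel(), img2.ravel(), bins=bins)
            p_joint = joint / joint.sum()
            p1 = p_joint.sum(axis=1)
            p2 = p_joint.sum(axis=0)
            eps = 1e-12                                   # avoids log(0) on empty histogram bins
            h1 = -np.sum(p1 * np.log(p1 + eps))
            h2 = -np.sum(p2 * np.log(p2 + eps))
            h_joint = -np.sum(p_joint * np.log(p_joint + eps))
            mutual_info = h1 + h2 - h_joint
            return 2.0 * mutual_info / (h1 + h2)

        # nmi = normalized_mutual_information(reference_frame, registered_frame)  # hypothetical frame arrays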

  13. Intrathoracic pressure regulation during cardiopulmonary resuscitation: a feasibility case-series.

    Science.gov (United States)

    Segal, Nicolas; Parquette, Brent; Ziehr, Jonathon; Yannopoulos, Demetris; Lindstrom, David

    2013-04-01

    Intrathoracic pressure regulation (IPR) is a novel, noninvasive therapy intended to increase cardiac output and blood pressure in hypotensive states by generating a negative end expiratory pressure of -12 cm H2O between positive pressure ventilations. In this first feasibility case-series, we tested the hypothesis that IPR improves end-tidal CO2 (ETCO2) during cardiopulmonary resuscitation (CPR). ETCO2 was used as a surrogate measure for circulation. All patients were treated initially with manual CPR and an impedance threshold device (ITD). When IPR-trained medics arrived on scene the ITD was removed and an IPR device (CirQLATOR™) was attached to the patient's advanced airway (intervention group). The IPR device lowered airway pressures to -9 mmHg after each positive pressure ventilation for the duration of the expiratory phase. ETCO2 was measured using a capnometer incorporated into the defibrillator system (LifePak™). Values are expressed as mean ± SEM. Results were compared using paired and unpaired Student's t test. p values of <0.05 were considered statistically significant. ETCO2 values in 11 patients in the case series were compared pre and during IPR therapy and also compared to 74 patients in the control group not treated with the new IPR device. ETCO2 values increased from an average of 21 ± 1 mmHg immediately before IPR application to an average value of 32 ± 5 mmHg and to a maximum value of 45 ± 5 mmHg during IPR treatment (p<0.001). In the control group ETCO2 values did not change significantly. Return of spontaneous circulation (ROSC) rates were 46% (34/74) with standard CPR and ITD versus 73% (8/11) with standard CPR and the IPR device (p<0.001). ETCO2 levels and ROSC rates were significantly higher in the study intervention group. These findings demonstrate that during CPR circulation may be significantly augmented by generation of a negative end expiratory pressure between each breath. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  14. Comparisons between two wavelet functions in extracting coherent structures from solar wind time series

    International Nuclear Information System (INIS)

    Bolzani, M.J.A.; Guarnieri, F.L.; Vieira, Paulo Cesar

    2009-01-01

    Nowadays, wavelet analysis of turbulent flows has become increasingly popular. However, the study of the geometric characteristics of wavelet functions is still poorly explored. In this work we compare the performance of two wavelet functions in extracting the coherent structures from solar wind velocity time series. The data series are from the years 1996 to 2002 (except 1998 and 1999). The wavelet algorithm decomposes the annual time series into two components, the coherent part and the non-coherent one, using the Daubechies-4 and Haar wavelet functions. The threshold assumed is based on a percentage of the maximum variance found in each dyadic scale. After the extraction procedure, we applied the power spectral density to the original time series and the coherent time series to obtain spectral indices. The results show higher spectral indices for the coherent part obtained by the Daubechies-4 wavelet than for that obtained by the Haar wavelet function. Using the kurtosis statistical parameter on the coherent and non-coherent time series, it was possible to conjecture that the differences found between the two wavelet functions may be associated with their geometric forms. (author)
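
    A simplified sketch of the coherent/non-coherent split using the pywt package with Daubechies-4 (or Haar) wavelets; the hard threshold used here, a fraction of each scale's maximum coefficient magnitude, is only a stand-in for the variance-based threshold described in the abstract.

        import numpy as np
        import pywt

        def extract_coherent(series, wavelet='db4', level=6, frac=0.1):
            """Split a series into coherent and non-coherent parts: in each detail scale,
            keep only coefficients whose magnitude exceeds a fraction of that scale's maximum."""
            series = np.asarray(series, dtype=float)
            coeffs = pywt.wavedec(series, wavelet, level=level)
            kept = [coeffs[0]]                              # approximation coefficients kept as-is
            for detail in coeffs[1:]:
                thr = frac * np.max(np.abs(detail))
                kept.append(np.where(np.abs(detail) >= thr, detail, 0.0))
            coherent = pywt.waverec(kept, wavelet)[:len(series)]
            return coherent, series - coherent

        # coherent, incoherent = extract_coherent(solar_wind_speed, wavelet='haar')  # hypothetical series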

  15. The solution of the point kinetics equations via converged accelerated Taylor series (CATS)

    Energy Technology Data Exchange (ETDEWEB)

    Ganapol, B.; Picca, P. [Dept. of Aerospace and Mechanical Engineering, Univ. of Arizona (United States); Previti, A.; Mostacci, D. [Laboratorio di Montecuccolino, Alma Mater Studiorum - Universita di Bologna (Italy)

    2012-07-01

    This paper deals with finding accurate solutions of the point kinetics equations including non-linear feedback, in a fast, efficient and straightforward way. A truncated Taylor series is coupled to continuous analytical continuation to provide the recurrence relations to solve the ordinary differential equations of point kinetics. Non-linear (Wynn-epsilon) and linear (Romberg) convergence accelerations are employed to provide highly accurate results for the evaluation of Taylor series expansions and extrapolated values of neutron and precursor densities at desired edits. The proposed Converged Accelerated Taylor Series, or CATS, algorithm automatically performs successive mesh refinements until the desired accuracy is obtained, making use of the intermediate results for converged initial values at each interval. Numerical performance is evaluated using case studies available from the literature. Nearly perfect agreement is found with the literature results generally considered most accurate. Benchmark quality results are reported for several cases of interest including step, ramp, zigzag and sinusoidal prescribed insertions and insertions with adiabatic Doppler feedback. A larger than usual (9) number of digits is included to encourage honest benchmarking. The benchmark is then applied to the enhanced piecewise constant algorithm (EPCA) currently being developed by the second author. (authors)
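
    The convergence acceleration mentioned above can be illustrated with the classical Wynn epsilon algorithm applied to a sequence of partial sums; this is a textbook version for illustration, not the authors' CATS or EPCA code.

        import numpy as np

        def wynn_epsilon(partial_sums):
            """Wynn's epsilon algorithm: extrapolate the limit of a slowly converging sequence."""
            s = np.asarray(partial_sums, dtype=float)
            n = len(s)
            eps = np.zeros((n + 1, n))
            eps[1, :] = s                          # row 0 holds eps_{-1} = 0, row 1 holds eps_0 = S_n
            for k in range(2, n + 1):
                for j in range(n - k + 1):
                    diff = eps[k - 1, j + 1] - eps[k - 1, j]
                    eps[k, j] = eps[k - 2, j + 1] + 1.0 / diff
            # Odd array rows correspond to even-order epsilons, which carry the accelerated estimates
            last = n if n % 2 == 1 else n - 1
            return eps[last, 0]

        # Example: partial sums of ln(2) = 1 - 1/2 + 1/3 - 1/4 + ...
        terms = [(-1.0) ** k / (k + 1) for k in range(10)]
        print(wynn_epsilon(np.cumsum(terms)), np.log(2.0))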

  16. Generating material strength standards of aluminum alloys for research reactors. Pt. 1. Yield strength values Sy and tensile strength values Su

    International Nuclear Information System (INIS)

    Tsuji, H.; Miya, K.

    1995-01-01

    Aluminum alloys are frequently used as structural materials for research reactors. The material strength standards, however, such as the yield strength values (Sy), the tensile strength values (Su) and the design fatigue curve - which are needed to use aluminum alloys as structural materials in "design by analysis" - have not yet been determined for those materials. Hence, a series of material tests was performed and the results were statistically analyzed with the aim of generating these material strength standards. This paper, the first in a series on material strength standards of aluminum alloys, describes the tensile-property aspects of the standards. The draft standards were compared with MITI no. 501 as well as with the ASME codes, and the trend of the available data was also examined. It was revealed that the draft proposal could be adopted as the material strength standards, and that the values of the draft standards at and above 150 °C for A6061-T6 and A6063-T6 could be applied only to the reactor operating conditions III and IV. The draft standards have also already been adopted in the Science and Technology Agency regulatory guide (standards for structural design of nuclear research plants). (orig.)

  17. The discount rate and the value of remaining years of life

    OpenAIRE

    Hartwick, John

    2008-01-01

    We back out an estimate of a personal discount rate of between 3 and 4 percent for a person with a life expectancy of 74 years who dies at age 30 (or 40) and has a value of statistical life of $6.3 million. Central to these calculations is the series generated by Murphy and Topel of value of life years (the dollar value of consumption plus the dollar value of leisure, with some smoothing for income in retirement). We employ the Makeham "model" of life expectancy in our calculations.

  18. Fourier Series Formalization in ACL2(r)

    Directory of Open Access Journals (Sweden)

    Cuong K. Chau

    2015-09-01

    Full Text Available We formalize some basic properties of Fourier series in the logic of ACL2(r), which is a variant of ACL2 that supports reasoning about the real and complex numbers by way of non-standard analysis. More specifically, we extend a framework for formally evaluating definite integrals of real-valued, continuous functions using the Second Fundamental Theorem of Calculus. Our extended framework is also applied to functions containing free arguments. Using this framework, we are able to prove the orthogonality relationships between trigonometric functions, which are the essential properties in Fourier series analysis. The sum rule for definite integrals of indexed sums is also formalized by applying the extended framework along with the First Fundamental Theorem of Calculus and the sum rule for differentiation. The Fourier coefficient formulas of periodic functions are then formalized from the orthogonality relations and the sum rule for integration. Consequently, the uniqueness of Fourier sums is a straightforward corollary. We also present our formalization of the sum rule for definite integrals of infinite series in ACL2(r). Part of this task is to prove the Dini Uniform Convergence Theorem and the continuity of a limit function under certain conditions. A key technique in our proofs of these theorems is to apply the overspill principle from non-standard analysis.

  19. Stabilizing simulations of complex stochastic representations for quantum dynamical systems

    Energy Technology Data Exchange (ETDEWEB)

    Perret, C; Petersen, W P, E-mail: wpp@math.ethz.ch [Seminar for Applied Mathematics, ETH, Zurich (Switzerland)

    2011-03-04

    Path integral representations of quantum dynamics can often be formulated as stochastic differential equations (SDEs). In a series of papers, Corney and Drummond (2004 Phys. Rev. Lett. 93 260401), Deuar and Drummond (2001 Comput. Phys. Commun. 142 442-5), Drummond and Gardiner (1980 J. Phys. A: Math. Gen. 13 2353-68), Gardiner and Zoller (2004 Quantum Noise: A Handbook of Markovian and Non-Markovian Quantum Stochastic Methods with Applications to Quantum Optics (Springer Series in Synergetics) 3rd edn (Berlin: Springer)) and Gilchrist et al (1997 Phys. Rev. A 55 3014-32) and their collaborators have derived SDEs from coherent-state representations for density matrices. Computationally, these SDEs are attractive because they seem simple to simulate. They can be quite unstable, however. In this paper, we consider some of the instabilities and propose a few remedies. In particular, because the variances of the simulated paths typically grow exponentially, the processes become de-localized in relatively short times. Hence, the issues of boundary conditions and stable integration methods become important. We use the Bose-Einstein Hamiltonian as an example. Our results reveal that it is possible to significantly extend integration times and show the periodic structure of certain functionals.

  20. TIME SERIES ANALYSIS ON STOCK MARKET FOR TEXT MINING CORRELATION OF ECONOMY NEWS

    Directory of Open Access Journals (Sweden)

    Sadi Evren SEKER

    2014-01-01

    Full Text Available This paper proposes an information retrieval method for economy news. The effect of economy news is researched at the word level, and stock market values are considered as the ground truth. The correlation between stock market prices and economy news is an already addressed problem for most countries. The most well-known approach is applying text mining to the news and some time series analysis techniques to stock market closing values in order to apply classification or clustering algorithms over the extracted features. This study goes further and asks the question: what are the available time series analysis techniques for the stock market closing values, and which one is the most suitable? In this study, the news and their dates are collected into a database and text mining is applied to the news; the text mining part has been kept simple, with only the term frequency - inverse document frequency method. For the time series analysis part, we have studied 10 different methods such as random walk, moving average, acceleration, Bollinger band, price rate of change, periodic average, difference, momentum or relative strength index and their variations. In this study we have also explained these techniques in a comparative way, and we have applied the methods to Turkish Stock Market closing values over a period of more than 2 years. On the other hand, we have applied the term frequency - inverse document frequency method to the economy news of one of the high-circulation newspapers in Turkey.
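
    A compact sketch of the two ingredients described above, with a hypothetical handful of news snippets and closing values: term frequency - inverse document frequency via scikit-learn, and two of the simpler closing-value indicators (moving average and price rate of change).

        import numpy as np
        from sklearn.feature_extraction.text import TfidfVectorizer

        # Hypothetical economy-news snippets, one document per day
        news_texts = [
            "central bank raises interest rates to fight inflation",
            "exports grow while inflation slows down",
            "stock market falls on weak industrial output",
        ]
        tfidf = TfidfVectorizer()
        news_features = tfidf.fit_transform(news_texts)      # documents x terms sparse matrix

        # Hypothetical closing-value series and two simple technical indicators
        closing = np.array([101.2, 102.5, 101.9, 103.4, 104.0, 102.8, 105.1])

        def moving_average(x, window=3):
            return np.convolve(x, np.ones(window) / window, mode="valid")

        def rate_of_change(x, lag=1):
            return (x[lag:] - x[:-lag]) / x[:-lag]

        print(moving_average(closing), rate_of_change(closing))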

  1. GPS Position Time Series @ JPL

    Science.gov (United States)

    Owen, Susan; Moore, Angelyn; Kedar, Sharon; Liu, Zhen; Webb, Frank; Heflin, Mike; Desai, Shailen

    2013-01-01

    Different flavors of GPS time series analysis at JPL - Use same GPS Precise Point Positioning Analysis raw time series - Variations in time series analysis/post-processing driven by different users. • JPL Global Time Series/Velocities - researchers studying reference frame, combining with VLBI/SLR/DORIS • JPL/SOPAC Combined Time Series/Velocities - crustal deformation for tectonic, volcanic, ground water studies • ARIA Time Series/Coseismic Data Products - Hazard monitoring and response focused • ARIA data system designed to integrate GPS and InSAR - GPS tropospheric delay used for correcting InSAR - Caltech's GIANT time series analysis uses GPS to correct orbital errors in InSAR - Zhen Liu's talking tomorrow on InSAR Time Series analysis

  2. Implementation of the Cheng Fuzzy Time Series Method for Predicting NO2 Gas Concentration in the Air

    Directory of Open Access Journals (Sweden)

    M Yoka Fathoni

    2017-05-01

    Full Text Available The forecasting process is essential for determining air quality when monitoring NO2 gas in the air. This research aims to develop a prediction information system for NO2 gas in the air using the Fuzzy Time Series Cheng method. The acquisition of NO2 gas data is integrated with a multichannel, multistation setup: the data acquisition process uses Wireless Sensor Network technology via broadband internet, and the data are sent to and stored in an online database on the web server. The recorded data are used as material for prediction. The acquired NO2 gas data are then predicted using the Cheng fuzzy time series, which re-divides the intervals of the first partition of the universe of discourse and fuzzifies the historical data to determine the Fuzzy Logical Relationships and Fuzzy Logical Relationship Groups, from which the predicted NO2 gas concentration values are obtained. Using 36 samples of NO2 gas data, a root mean squared error of 2.08% was obtained. This result indicates that the Fuzzy Time Series Cheng method is good enough to be used in predicting the NO2 gas.

  3. A Time-Series Water Level Forecasting Model Based on Imputation and Variable Selection Method

    OpenAIRE

    Jun-He Yang; Ching-Hsue Cheng; Chia-Pan Chan

    2017-01-01

    Reservoirs are important for households and impact the national economy. This paper proposed a time-series forecasting model based on estimating a missing value followed by variable selection to forecast the reservoir's water level. This study collected data from the Taiwan Shimen Reservoir as well as daily atmospheric data from 2008 to 2015. The two datasets are concatenated into an integrated dataset based on ordering of the data as a research dataset. The proposed time-series forecasting m...

  4. An introduction to Laplace transforms and Fourier series

    CERN Document Server

    Dyke, Phil

    2014-01-01

    Laplace transforms continue to be a very important tool for the engineer, physicist and applied mathematician. They are also now useful to financial, economic and biological modellers as these disciplines become more quantitative. Any problem that has underlying linearity and with solution based on initial values can be expressed as an appropriate differential equation and hence be solved using Laplace transforms. In this book, there is a strong emphasis on application with the necessary mathematical grounding. There are plenty of worked examples with all solutions provided. This enlarged new edition includes generalised Fourier series and a completely new chapter on wavelets. Only knowledge of elementary trigonometry and calculus are required as prerequisites. An Introduction to Laplace Transforms and Fourier Series will be useful for second and third year undergraduate students in engineering, physics or mathematics, as well as for graduates in any discipline such as financial mathematics, econometrics and ...

  5. The power series method in the effectiveness factor calculations

    OpenAIRE

    Filipich, C. P.; Villa, L. T.; Grossi, Ricardo Oscar

    2017-01-01

    In the present paper, exact analytical solutions are obtained for nonlinear ordinary differential equations which appear in complex diffusion-reaction processes. A technique based on the power series method is used. Numerical results were computed for a number of cases which correspond to boundary value problems available in the literature. Additionally, new numerical results were generated for several important cases. Fil: Filipich, C. P.. Universidad Tecnológica Nacional. Facultad Regiona...

  6. The lean product design and development journey a practical view

    CERN Document Server

    Pessôa, Marcus Vinicius Pereira

    2017-01-01

    This book presents a series of high performance product design (PD) and development best practices that can create or improve product development organization. In contrast to other books that focus only on Toyota or other individual companies applying lean IPD, this book explains the lean philosophy more broadly and includes discussions of systems engineering, design for X (DFX), agile development, integrated product development, and project management. The “Lean Journey” proposed here takes a value-centric approach, where the lean principles are applied to PD to allow the tools and methods selected to emerge from observation of the individual characteristics of each enterprise. This means that understanding lean product development (LPD) is not about knowing which tools are available but knowing how to apply the philosophy. The book comes with an accompanying manual with problems and solutions available on Springer Extras.

  7. Comparison of zwitterionic N-alkylaminomethanesulfonic acids to related compounds in the Good buffer series

    Directory of Open Access Journals (Sweden)

    Robert D. Long

    2010-04-01

    Full Text Available Several N-alkyl and N,N-dialkylaminomethanesulfonic acids were synthesized (as zwitterions and/or sodium salts) to be tested for utility as biological buffers at lower pH levels than existing Good buffer compounds (aminoalkanesulfonates with a minimum of two carbons between amine and sulfonic acid groups, as originally described by Norman Good, and in common use as biological buffers). Our hypothesis was that a shorter carbon chain (one carbon between the amino and sulfonic acid groups) should lower the ammonium ion pKa values. The alkylaminomethanesulfonate compounds were synthesized in aqueous solution by reaction of primary or secondary amines with the formaldehyde/sodium hydrogensulfite addition compound. The pKa values of the ammonium ions of this series of compounds (compared to existing Good buffers) were found to correlate well with the length of the carbon chain between the amino and sulfonate moieties, with a significant decrease in amine basicity in the aminomethanesulfonate compounds (pKa decrease of 2 units or more compared to existing Good buffers). An exception was found for the 2-hydroxypiperazine series, which shows only a small pKa decrease, probably due to the site of protonation in this compound (as confirmed by X-ray crystal structure). X-ray crystallographic structures of two members of the series are reported. Several of these compounds have pKa values that would indicate potential utility for buffering at pH levels below the normal physiological range (pKa values in the range of 3 to 6) without aqueous solubility problems - a range that is problematic for currently available Good buffers. Unfortunately, the alkylaminomethanesulfonates were found to degrade (with loss of their buffering ability) at pH levels below the pKa value and were unstable at elevated temperature (as when autoclaving), thus limiting their utility.

  8. A Hybrid Fuzzy Time Series Approach Based on Fuzzy Clustering and Artificial Neural Network with Single Multiplicative Neuron Model

    Directory of Open Access Journals (Sweden)

    Ozge Cagcag Yolcu

    2013-01-01

    Full Text Available Particularly in recent years, artificial intelligence optimization techniques have been used to make fuzzy time series approaches more systematic and to improve forecasting performance. Besides, some fuzzy clustering methods and artificial neural networks with different structures are used in the fuzzification of observations and in the determination of fuzzy relationships, respectively. In approaches that consider the membership values, the membership values are either determined subjectively or fuzzy outputs of the system are obtained by assuming a relation between membership values in the identification of the relation; this necessitates a defuzzification step and increases the model error. In this study, membership values were obtained more systematically by using the Gustafson-Kessel fuzzy clustering technique. The use of an artificial neural network with a single multiplicative neuron model in the identification of the fuzzy relation eliminated the architecture selection problem as well as the necessity for a defuzzification step, by constituting target values from real observations of the time series. The training of the artificial neural network with the single multiplicative neuron model, which is used in the fuzzy relation identification step, is carried out with particle swarm optimization. The proposed method is implemented using various time series and the results are compared with those of previous studies to demonstrate the performance of the proposed method.

  9. Conservation value of post-mining headwaters: drainage channels at a lignite spoil heap harbour threatened stream dragonflies

    Czech Academy of Sciences Publication Activity Database

    Tichánek, Filip; Tropek, Robert

    2015-01-01

    Roč. 19, č. 5 (2015), s. 975-985 ISSN 1366-638X R&D Projects: GA ČR GAP504/12/2525 Grant - others:GA JU(CZ) 168/2013/P Institutional support: RVO:60077344 Keywords : biodiversity conservation * drainage ditches * freshwater habitats Subject RIV: EH - Ecology, Behaviour Impact factor: 1.431, year: 2015 http://link.springer.com/article/10.1007%2Fs10841-015-9814-1

  10. Effect of noise and filtering on largest Lyapunov exponent of time series associated with human walking.

    Science.gov (United States)

    Mehdizadeh, Sina; Sanjari, Mohammad Ali

    2017-11-07

    This study aimed to determine the effect of added noise, filtering and time series length on the largest Lyapunov exponent (LyE) value calculated for time series obtained from a passive dynamic walker. The simplest passive dynamic walker model, comprising two massless legs connected by a frictionless hinge joint at the hip, was adopted to generate walking time series. The generated time series was used to construct a state space with an embedding dimension of 3 and a time delay of 100 samples. The LyE was calculated as the exponential rate of divergence of neighboring trajectories of the state space using Rosenstein's algorithm. To determine the effect of noise on LyE values, seven levels of Gaussian white noise (SNR = 55-25 dB with 5 dB steps) were added to the time series. In addition, filtering was performed using a range of cutoff frequencies from 3 Hz to 19 Hz with 2 Hz steps. The LyE was calculated for both noise-free and noisy time series with different lengths of 6, 50, 100 and 150 strides. Results demonstrated a high percent error for LyE in the presence of noise. Therefore, these observations suggest that Rosenstein's algorithm might not perform well in the presence of added experimental noise. Furthermore, findings indicated that at least 50 walking strides are required to calculate LyE to account for the effect of noise. Finally, observations support that conservative filtering of the time series with a high cutoff frequency might be more appropriate prior to calculating LyE. Copyright © 2017 Elsevier Ltd. All rights reserved.
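
    The two preprocessing steps of the study, adding Gaussian white noise at a prescribed SNR and building the delay-embedded state space (dimension 3, delay 100 samples), can be sketched as below; walking_series is a placeholder, and the LyE itself would still require Rosenstein's divergence fit on top of this.

        import numpy as np

        def add_noise_snr(x, snr_db):
            """Add Gaussian white noise to a series at a prescribed signal-to-noise ratio (dB)."""
            x = np.asarray(x, dtype=float)
            signal_power = np.mean((x - x.mean()) ** 2)
            noise_power = signal_power / (10.0 ** (snr_db / 10.0))
            return x + np.random.normal(0.0, np.sqrt(noise_power), size=x.shape)

        def delay_embed(x, dim=3, delay=100):
            """Time-delay embedding used to build the state space (dimension 3, delay 100 samples)."""
            n = len(x) - (dim - 1) * delay
            return np.column_stack([x[i * delay:i * delay + n] for i in range(dim)])

        # noisy = add_noise_snr(walking_series, snr_db=40)      # walking_series is hypothetical
        # state_space = delay_embed(noisy, dim=3, delay=100)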

  11. Journal Afrika Statistika ISSN 0852-0305 Extreme value theory for ...

    African Journals Online (AJOL)

    Borel σ-algebra generated by open sets. For x ∈ E and A ∈ E, .... The ground level ozone process has piecewise linear structure. It switches ...... Extreme value theory for a class of nonstationary time series with applications. Ann. Appl. Prob.

  12. Towards Detection of Cutting in Hay Meadows by Using of NDVI and EVI Time Series

    Directory of Open Access Journals (Sweden)

    Andrej Halabuk

    2015-05-01

    Full Text Available The main requirement for preserving European hay meadows in good condition is prerequisite cut management. However, monitoring these practices on a larger scale is very difficult. Our study analyses the use of MODIS vegetation index products, namely EVI and NDVI, to discriminate cut and uncut meadows in Slovakia. We tested the added value of simple transformations of the raw data series (seasonal statistics, first-difference series), compared EVI and NDVI, and analyzed optimal periods, the number of scenes and the effect of smoothing on classification performance. The first-difference series transformation brought substantial improvement in classification results. The best-case NDVI series classification yielded an overall accuracy of 85% with balanced rates of producer's and user's accuracies for both classes. EVI yielded slightly lower values, though not significantly different, although the user's accuracy for cut meadows reached only 67%. Optimal periods for discriminating cut and uncut meadows lay between 16 May and 4 August, meaning only seven consecutive images are enough to accurately detect cutting in hay meadows. More importantly, the 16-day compositing period seemed to be enough for detection of cutting, which is a time span that might hopefully be achieved by upcoming on-board HR sensors (e.g., Sentinel-2).

  13. Determination of internal series resistance of PV devices: repeatability and uncertainty

    International Nuclear Information System (INIS)

    Trentadue, Germana; Pavanello, Diego; Salis, Elena; Field, Mike; Müllejans, Harald

    2016-01-01

    The calibration of photovoltaic devices requires the measurement of their current–voltage characteristics at standard test conditions (STC). As the latter can only be reached approximately, a curve translation is necessary, requiring among others the internal series resistance of the photovoltaic device as an input parameter. Therefore accurate and reliable determination of the series resistance is important in measurement and test laboratories. This work follows standard IEC 60891 ed 2 (2009) for the determination of the internal series resistance and investigates repeatability and uncertainty of the result in three aspects for a number of typical photovoltaic technologies. Firstly the effect of varying device temperature on the determined series resistance is determined experimentally and compared to a theoretical derivation showing agreement. It is found that the series resistance can be determined with an uncertainty of better than 5% if the device temperature is stable within  ±0.1 °C, whereas the temperature range of  ±2 °C allowed by the standard leads to much larger variations. Secondly the repeatability of the series resistance determination with respect to noise in current–voltage measurement is examined yielding typical values of  ±5%. Thirdly the determination of the series resistance using three different experimental set-ups (solar simulators) shows agreement on the level of  ±5% for crystalline Silicon photovoltaic devices and deviations up to 15% for thin-film devices. It is concluded that the internal series resistance of photovoltaic devices could be determined with an uncertainty of better than 10%. The influence of this uncertainty in series resistance on the electrical performance parameters of photovoltaic devices was estimated and showed a contribution of 0.05% for open-circuit voltage and 0.1% for maximum power. Furthermore it is concluded that the range of device temperatures allowed during determination of series

  14. Effect of calibration data series length on performance and optimal parameters of hydrological model

    Directory of Open Access Journals (Sweden)

    Chuan-zhe Li

    2010-12-01

    Full Text Available In order to assess the effects of calibration data series length on the performance and optimal parameter values of a hydrological model in ungauged or data-limited catchments (where data are non-continuous and fragmentary in some catchments), we used non-continuous calibration periods to obtain more independent streamflow data for SIMHYD (simple hydrology model) calibration. Nash-Sutcliffe efficiency and percentage water balance error were used as performance measures. The particle swarm optimization (PSO) method was used to calibrate the rainfall-runoff models. Different lengths of data series ranging from one year to ten years, randomly sampled, were used to study the impact of calibration data series length. Fifty-five relatively unimpaired catchments located all over Australia with daily precipitation, potential evapotranspiration, and streamflow data were tested to obtain more general conclusions. The results show that longer calibration data series do not necessarily result in better model performance. In general, eight years of data are sufficient to obtain steady estimates of model performance and parameters for the SIMHYD model. It is also shown that most humid catchments require fewer calibration data to obtain a good performance and stable parameter values. The model performs better in humid and semi-humid catchments than in arid catchments. Our results may have useful and interesting implications for the efficiency of using limited observation data for hydrological model calibration in different climates.
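
    The two performance measures named above are simple to compute; a minimal version is shown below, with the observed and simulated streamflow arrays as placeholders.

        import numpy as np

        def nash_sutcliffe(observed, simulated):
            """Nash-Sutcliffe efficiency: 1 is a perfect fit, 0 is no better than the observed mean."""
            observed, simulated = np.asarray(observed, float), np.asarray(simulated, float)
            return 1.0 - np.sum((observed - simulated) ** 2) / np.sum((observed - observed.mean()) ** 2)

        def water_balance_error(observed, simulated):
            """Percentage water balance error between simulated and observed total runoff."""
            observed, simulated = np.asarray(observed, float), np.asarray(simulated, float)
            return 100.0 * (np.sum(simulated) - np.sum(observed)) / np.sum(observed)

        # nse = nash_sutcliffe(streamflow_obs, streamflow_sim)          # hypothetical arrays
        # wbe = water_balance_error(streamflow_obs, streamflow_sim)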

  15. Multifractal detrended cross-correlation analysis on gold, crude oil and foreign exchange rate time series

    Science.gov (United States)

    Pal, Mayukha; Madhusudana Rao, P.; Manimaran, P.

    2014-12-01

    We apply the recently developed multifractal detrended cross-correlation analysis method to investigate the cross-correlation behavior and fractal nature between two non-stationary time series. We analyze the daily return price of gold, West Texas Intermediate and Brent crude oil, and foreign exchange rate data, over a period of 18 years. The cross-correlation has been measured quantitatively from the Hurst scaling exponents and the singularity spectrum. From the results, the existence of multifractal cross-correlation between all of these time series is found. We also found that the cross-correlation between gold and oil prices possesses uncorrelated behavior while the remaining bivariate time series possess persistent behavior. It was observed for five bivariate series that the cross-correlation exponents are less than the calculated average generalized Hurst exponents (GHE) for q < 0, and for one bivariate series the cross-correlation exponent is greater than the GHE for all q values.

  16. Bias Correction with Jackknife, Bootstrap, and Taylor Series

    OpenAIRE

    Jiao, Jiantao; Han, Yanjun; Weissman, Tsachy

    2017-01-01

    We analyze the bias correction methods using jackknife, bootstrap, and Taylor series. We focus on the binomial model, and consider the problem of bias correction for estimating $f(p)$, where $f \in C[0,1]$ is arbitrary. We characterize the supremum norm of the bias of general jackknife and bootstrap estimators for any continuous functions, and demonstrate that in the delete-$d$ jackknife, different values of $d$ may lead to drastically different behavior. We show that in the binomial ...
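
    A small illustration of the delete-1 jackknife bias correction in the binomial setting discussed above, using the plug-in estimator of f(p) = p(1-p); the function names and the simulated sample are illustrative only.

        import numpy as np

        def jackknife_bias_corrected(data, estimator):
            """Delete-1 jackknife bias correction: n*theta_hat - (n-1)*mean(leave-one-out estimates)."""
            data = np.asarray(data, dtype=float)
            n = len(data)
            theta_full = estimator(data)
            loo = np.array([estimator(np.delete(data, i)) for i in range(n)])
            return n * theta_full - (n - 1) * loo.mean()

        # Binomial example: the plug-in estimator of f(p) = p*(1-p) is biased,
        # and the jackknife removes the O(1/n) part of that bias.
        rng = np.random.default_rng(0)
        draws = rng.binomial(1, 0.3, size=50)
        plug_in = lambda x: x.mean() * (1.0 - x.mean())
        print(plug_in(draws), jackknife_bias_corrected(draws, plug_in))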

  17. Changes According to Incubation Periods in Some Microbiological Characteristics at Soil Samples of Some Soil Series from the Gelemen Agricultural Administration

    OpenAIRE

    KARA, Emine Erman

    1998-01-01

    Changes according to incubation periods in some microbiological characteristics of soil samples from soil series of the Gelemen Agricultural Administration were investigated in this study. The results show that bacteria and actinomycetes had values in the first periods of incubation (30 ºC and field capacity) that increased in the following periods. However, the fungal population changed depending upon the series properties and reached maximum values on the 24th and 32nd days after the beginning of incubation. During...

  18. All Values Are Equal, But Some Are More Equal Than Others: A Critique and Redefinition of Values Clarification. Monograph Series No. 102 Code C-9.

    Science.gov (United States)

    McKnight, Richard

    This literature review presents the essentials of the theory of Values Clarification and its fundamental terms, and makes a critical analysis of the Values Clarification Theory of Valuation. (Author/JLL)

  19. Highly comparative time-series analysis: the empirical structure of time series and their methods.

    Science.gov (United States)

    Fulcher, Ben D; Little, Max A; Jones, Nick S

    2013-06-06

    The process of collecting and organizing sets of observations represents a common theme throughout the history of science. However, despite the ubiquity of scientists measuring, recording and analysing the dynamics of different processes, an extensive organization of scientific time-series data and analysis methods has never been performed. Addressing this, annotated collections of over 35 000 real-world and model-generated time series, and over 9000 time-series analysis algorithms are analysed in this work. We introduce reduced representations of both time series, in terms of their properties measured by diverse scientific methods, and of time-series analysis methods, in terms of their behaviour on empirical time series, and use them to organize these interdisciplinary resources. This new approach to comparing across diverse scientific data and methods allows us to organize time-series datasets automatically according to their properties, retrieve alternatives to particular analysis methods developed in other scientific disciplines and automate the selection of useful methods for time-series classification and regression tasks. The broad scientific utility of these tools is demonstrated on datasets of electroencephalograms, self-affine time series, heartbeat intervals, speech signals and others, in each case contributing novel analysis techniques to the existing literature. Highly comparative techniques that compare across an interdisciplinary literature can thus be used to guide more focused research in time-series analysis for applications across the scientific disciplines.

  20. Geostatistics for Large Datasets

    KAUST Repository

    Sun, Ying

    2011-10-31

    Each chapter should be preceded by an abstract (10–15 lines long) that summarizes the content. The abstract will appear onlineat www.SpringerLink.com and be available with unrestricted access. This allows unregistered users to read the abstract as a teaser for the complete chapter. As a general rule the abstracts will not appear in the printed version of your book unless it is the style of your particular book or that of the series to which your book belongs. Please use the ’starred’ version of the new Springer abstractcommand for typesetting the text of the online abstracts (cf. source file of this chapter template abstract) and include them with the source files of your manuscript. Use the plain abstractcommand if the abstract is also to appear in the printed version of the book.

  1. Geostatistics for Large Datasets

    KAUST Repository

    Sun, Ying; Li, Bo; Genton, Marc G.

    2011-01-01

    Each chapter should be preceded by an abstract (10–15 lines long) that summarizes the content. The abstract will appear onlineat www.SpringerLink.com and be available with unrestricted access. This allows unregistered users to read the abstract as a teaser for the complete chapter. As a general rule the abstracts will not appear in the printed version of your book unless it is the style of your particular book or that of the series to which your book belongs. Please use the ’starred’ version of the new Springer abstractcommand for typesetting the text of the online abstracts (cf. source file of this chapter template abstract) and include them with the source files of your manuscript. Use the plain abstractcommand if the abstract is also to appear in the printed version of the book.

  2. Common pitfalls in statistical analysis: "P" values, statistical significance and confidence intervals

    Directory of Open Access Journals (Sweden)

    Priya Ranganathan

    2015-01-01

    Full Text Available In the second part of a series on pitfalls in statistical analysis, we look at various ways in which a statistically significant study result can be expressed. We debunk some of the myths regarding the 'P' value, explain the importance of 'confidence intervals', and clarify the importance of including both values in a paper.
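
    A toy illustration of reporting both quantities for a two-group comparison, using scipy; the data are simulated, and the confidence interval uses a simple pooled-degrees-of-freedom approximation.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(1)
        group_a = rng.normal(10.0, 2.0, size=40)   # hypothetical measurements, treatment A
        group_b = rng.normal(11.0, 2.0, size=40)   # hypothetical measurements, treatment B

        # P value from a two-sample t test
        t_stat, p_value = stats.ttest_ind(group_a, group_b)

        # Approximate 95% confidence interval for the difference in means
        diff = group_b.mean() - group_a.mean()
        se = np.sqrt(group_a.var(ddof=1) / len(group_a) + group_b.var(ddof=1) / len(group_b))
        dof = len(group_a) + len(group_b) - 2
        margin = stats.t.ppf(0.975, dof) * se
        print(p_value, (diff - margin, diff + margin))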

  3. Value and Crisis: Bichler and Nitzan versus Marx

    Directory of Open Access Journals (Sweden)

    Andrew Kliman

    2011-04-01

    Full Text Available In this article, Andrew Kliman responds to Bichler and Nitzan's recent paper on ‘Systemic Fear, Modern Finance and the Future of Capitalism’ (2010). He then goes on to raise a series of issues concerning the critique of Marxian value theory which these authors put forward in their book Capital as Power (Nitzan and Bichler, 2009).

  4. The effect of distal utility value intervention for students’ learning

    OpenAIRE

    KERA, Masaki; NAKAYA, Motoyuki

    2017-01-01

    The purpose of this study was to determine whether a utility value intervention influenced students’ motivation and performance. Specifically, we examined the effect of distal utility value (i.e., the recognition of content usefulness for skill development that can improve daily and future endeavors) instruction in this study. Fifty-one Japanese undergraduate students completed an experimental session in the laboratory, in which they performed a series of logical reasoning problem-solving tasks...

  5. Fractal dimension algorithms and their application to time series associated with natural phenomena

    International Nuclear Information System (INIS)

    La Torre, F Cervantes-De; González-Trejo, J I; Real-Ramírez, C A; Hoyos-Reyes, L F

    2013-01-01

    Chaotic invariants like the fractal dimensions are used to characterize non-linear time series. The fractal dimension is an important characteristic of systems, because it contains information about their geometrical structure at multiple scales. In this work, three algorithms are applied to non-linear time series: spectral analysis, rescaled range analysis and Higuchi's algorithm. The analyzed time series are associated with natural phenomena. The disturbance storm time (Dst) index is a global indicator of the state of the Earth's geomagnetic activity. The time series used in this work show a self-similar behavior, which depends on the time scale of measurements. It is also observed that fractal dimensions, D, calculated with Higuchi's method may not be constant over all time scales. This work shows that during 2001, D reaches its lowest values in March and November. The possibility that D recovers a change pattern arising from self-organized critical phenomena is also discussed.
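
    Of the three algorithms listed, Higuchi's is the most compact to state in code; the sketch below follows the standard formulation (average curve length at subsampling factor k, slope of log L(k) versus log(1/k)), with an illustrative kmax.

        import numpy as np

        def higuchi_fd(x, kmax=10):
            """Higuchi's fractal dimension of a 1-D time series."""
            x = np.asarray(x, dtype=float)
            n = len(x)
            log_k, log_l = [], []
            for k in range(1, kmax + 1):
                lengths = []
                for m in range(k):
                    idx = np.arange(m, n, k)                     # subsampled curve x[m], x[m+k], ...
                    if len(idx) < 2:
                        continue
                    dist = np.sum(np.abs(np.diff(x[idx])))
                    norm = (n - 1) / ((len(idx) - 1) * k)        # Higuchi normalization factor
                    lengths.append(dist * norm / k)
                log_k.append(np.log(1.0 / k))
                log_l.append(np.log(np.mean(lengths)))
            d, _ = np.polyfit(log_k, log_l, 1)                   # slope of log L(k) vs log(1/k)
            return d

        # print(higuchi_fd(np.random.randn(2000)))   # white noise gives D close to 2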

  6. Portfolio Optimization under Local-Stochastic Volatility: Coefficient Taylor Series Approximations & Implied Sharpe Ratio

    OpenAIRE

    Lorig, Matthew; Sircar, Ronnie

    2015-01-01

    We study the finite horizon Merton portfolio optimization problem in a general local-stochastic volatility setting. Using model coefficient expansion techniques, we derive approximations for both the value function and the optimal investment strategy. We also analyze the 'implied Sharpe ratio' and derive a series approximation for this quantity. The zeroth-order approximation of the value function and optimal investment strategy correspond to those obtained by Merton (1969) when the risky...

  7. The Economic Value of Human Capital

    OpenAIRE

    Gulie Alexandra Emanuela

    2012-01-01

    The human factor, created through physical and/or intellectual work on all existing material, is unequivocally the active value of any work; i.e., human capital translates into the different activities, specialized or not, that individuals create. The history of the term human capital has experienced over time a series of ups and downs, as it was accepted or rejected by academia and the political class. Although it gained affirmation and its conceptual structure only after the seventh decade of the twentieth cen...

  8. Transient voltage sharing in series-coupled high voltage switches

    Directory of Open Access Journals (Sweden)

    Editorial Office

    1992-07-01

    Full Text Available For switching voltages in excess of the maximum blocking voltage of a switching element (for example, a thyristor, MOSFET or bipolar transistor), such elements are often coupled in series - and additional circuitry has to be provided to ensure equal voltage sharing. Between each such series element and system ground there is a certain parasitic capacitance that may draw a significant current during high-speed voltage transients. The "open" switch is modelled as a ladder network. Analysis reveals an exponential progression in the distribution of the applied voltage across the elements. Overstressing thus occurs in some of the elements at levels of the total voltage that are significantly below the design value. This difficulty is overcome by grading the voltage sharing circuitry, coupled in parallel with each element, in a prescribed manner, as set out here.
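
    The ladder-network analysis can be reproduced numerically: with each open switch modeled as a capacitance and a parasitic capacitance from every internal node to ground, a linear solve gives the transient voltage share of each element. The component values below are arbitrary placeholders, and the bottom of the string is assumed grounded.

        import numpy as np

        def element_voltages(n_elements, c_switch, c_parasitic, v_total):
            """Transient voltage distribution across n series-connected open switches, each modeled
            as capacitance c_switch, with c_parasitic from every internal node to ground."""
            n_nodes = n_elements - 1                       # internal nodes between elements
            a = np.zeros((n_nodes, n_nodes))
            b = np.zeros(n_nodes)
            for k in range(n_nodes):                       # charge balance at each internal node
                a[k, k] = 2.0 * c_switch + c_parasitic
                if k > 0:
                    a[k, k - 1] = -c_switch
                if k < n_nodes - 1:
                    a[k, k + 1] = -c_switch
            b[0] = c_switch * v_total                      # top node held at v_total, bottom node at 0
            v_nodes = np.linalg.solve(a, b)
            v_all = np.concatenate(([v_total], v_nodes, [0.0]))
            return -np.diff(v_all)                         # voltage across each element, top to bottom

        # Ten series elements: the element nearest the driven high-voltage terminal takes the largest share
        print(element_voltages(10, c_switch=100e-12, c_parasitic=10e-12, v_total=10e3))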

  9. Generic clearance values

    International Nuclear Information System (INIS)

    Bossio, M.C.; Muniz, C.C.

    2009-01-01

    This paper analyzes the Generic Clearance Values established for natural and artificial radionuclides with the objective of evaluating their degree of conservatism in view of adopting them into the regulatory body. Generic clearance values for natural radionuclides have been chosen by expert judgment as the optimum boundary between, on the one hand, the ubiquitous unmodified soil concentrations and, on the other hand, activity concentrations in ores, mineral sands, industrial residues and wastes. For artificial radionuclides the clearance levels have been derived from the scenarios postulated in the document 'Safety Reports Series Nr 44' of the IAEA considering quantitative exemption criteria. A set of 8 scenarios was postulated covering external, ingestion and inhalation exposure pathways. For each radionuclide, the generic clearance level was derived as the more restrictive value obtained from the scenarios, that is the lowest ratio between the applicable individual dose and the dose per unit activity concentration (Bq/g). The individual dose was calculated by a formula depending on each scenario and pathway, with different parameters, such as exposure time, dosimetric factors, dilution factor, density of the material, geometric factors, etc. It was concluded that the basis and parameters used for the derivation of the generic clearance levels are quite conservative and therefore their adoption in Argentina has been recommended. It is expected that their implementation will contribute to optimizing the regulatory management system. (author)

  10. Generic Clearance Values

    International Nuclear Information System (INIS)

    Bossio, M.C.; Muniz, C.C.

    2010-01-01

    This paper analyzes the generic clearance values established for natural and artificial radionuclides with the objective of evaluating their degree of conservatism, with a view to their adoption by the regulatory body. Generic clearance values for natural radionuclides have been chosen by expert judgment as the optimum boundary between, on the one hand, the ubiquitous unmodified soil concentrations and, on the other hand, activity concentrations in ores, mineral sands, industrial residues and wastes. For artificial radionuclides the clearance levels have been derived from the scenarios postulated in the IAEA document Safety Reports Series No. 44, considering quantitative exemption criteria. A set of 8 scenarios was postulated covering external, ingestion and inhalation exposure pathways. For each radionuclide, the generic clearance level was derived as the most restrictive value obtained from the scenarios, that is, the lowest ratio between the applicable individual dose and the dose per unit activity concentration (Bq/g). The individual dose was calculated by a formula depending on each scenario and pathway, with different parameters such as exposure time, dosimetric factors, dilution factor, density of the material, geometric factors, etc. It was concluded that the basis and parameters used for the derivation of the generic clearance levels are quite conservative and therefore their adoption in Argentina has been recommended. It is expected that their implementation will contribute to optimizing the regulatory management system. (authors) [es
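    To make the derivation rule concrete, the sketch below reproduces the "most restrictive scenario" logic in a few lines of Python. The scenario names, dose criterion and dose-per-unit-activity figures are invented placeholders, not values taken from the IAEA Safety Reports Series No. 44.

```python
# Illustrative sketch of the clearance-level derivation rule described above.
# All numbers are hypothetical placeholders, not regulatory values.

dose_criterion_mSv = 0.01   # assumed individual dose criterion per scenario (mSv/a)

# assumed dose per unit activity concentration, mSv/a per Bq/g, for each scenario
dose_per_unit_activity = {
    "external_worker": 2.0e-3,
    "external_public": 5.0e-4,
    "ingestion_child": 8.0e-3,
    "inhalation_dust": 1.5e-3,
}

# for each scenario, the activity concentration that would give the dose criterion;
# the generic clearance level is the most restrictive (lowest) of these ratios
per_scenario = {s: dose_criterion_mSv / d for s, d in dose_per_unit_activity.items()}
clearance_level_Bq_per_g = min(per_scenario.values())

print(per_scenario)
print(f"generic clearance level: {clearance_level_Bq_per_g:.2f} Bq/g")
```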

  11. Recurrent Neural Network For Forecasting Time Series With Long Memory Pattern

    Science.gov (United States)

    Walid; Alamsyah

    2017-04-01

    Recurrent Neural Networks (RNNs), as one class of hybrid models, are often used to predict and estimate issues related to electricity and can describe the causes of the swelling electrical load experienced by PLN. This research develops RNN forecasting procedures for time series with long-memory patterns, taking as its application the national electrical load, which naturally follows a different trend from the electrical load of other countries. The research produces a forecasting algorithm for time series with long-memory patterns using an E-RNN, hereafter referred to as the Fractional Integrated Recurrent Neural Network (FIRNN) algorithm. The prediction results for long-memory time series using the FIRNN model showed that the model with data differencing in the range [-1,1] and the FIRNN(24,6,1) configuration provides the smallest MSE value, namely 0.00149684.

  12. Novel series of 1,2,4-trioxane derivatives as antimalarial agents.

    Science.gov (United States)

    Rudrapal, Mithun; Chetia, Dipak; Singh, Vineeta

    2017-12-01

    Among three series of 1,2,4-trioxane derivatives, five compounds showed good in vitro antimalarial activity, three of which exhibited better activity against the P. falciparum resistant (RKL9) strain than against the sensitive (3D7) one. The two best compounds were one from the aryl series and the other from the heteroaryl series, with IC50 values of 1.24 µM and 1.24 µM, and 1.06 µM and 1.17 µM, against the sensitive and resistant strains, respectively. Further, the trioxane derivatives exhibited good binding affinity for the P. falciparum cysteine protease falcipain 2 receptor (PDB id: 3BPF), with well-defined drug-like and pharmacokinetic properties based on Lipinski's rule of five together with additional physicochemical and ADMET parameters. In view of their antimalarial potential, the 1,2,4-trioxane derivative(s) reported herein may be useful as novel antimalarial lead(s) in the discovery and development of future antimalarial drug candidates acting as P. falciparum falcipain 2 inhibitors against resistant malaria.

  13. Time series analysis for psychological research: examining and forecasting change.

    Science.gov (United States)

    Jebb, Andrew T; Tay, Louis; Wang, Wei; Huang, Qiming

    2015-01-01

    Psychological research has increasingly recognized the importance of integrating temporal dynamics into its theories, and innovations in longitudinal designs and analyses have allowed such theories to be formalized and tested. However, psychological researchers may be relatively unequipped to analyze such data, given its many characteristics and the general complexities involved in longitudinal modeling. The current paper introduces time series analysis to psychological research, an analytic domain that has been essential for understanding and predicting the behavior of variables across many diverse fields. First, the characteristics of time series data are discussed. Second, different time series modeling techniques are surveyed that can address various topics of interest to psychological researchers, including describing the pattern of change in a variable, modeling seasonal effects, assessing the immediate and long-term impact of a salient event, and forecasting future values. To illustrate these methods, an illustrative example based on online job search behavior is used throughout the paper, and a software tutorial in R for these analyses is provided in the Supplementary Materials.

  14. Time series analysis for psychological research: examining and forecasting change

    Science.gov (United States)

    Jebb, Andrew T.; Tay, Louis; Wang, Wei; Huang, Qiming

    2015-01-01

    Psychological research has increasingly recognized the importance of integrating temporal dynamics into its theories, and innovations in longitudinal designs and analyses have allowed such theories to be formalized and tested. However, psychological researchers may be relatively unequipped to analyze such data, given its many characteristics and the general complexities involved in longitudinal modeling. The current paper introduces time series analysis to psychological research, an analytic domain that has been essential for understanding and predicting the behavior of variables across many diverse fields. First, the characteristics of time series data are discussed. Second, different time series modeling techniques are surveyed that can address various topics of interest to psychological researchers, including describing the pattern of change in a variable, modeling seasonal effects, assessing the immediate and long-term impact of a salient event, and forecasting future values. To illustrate these methods, an illustrative example based on online job search behavior is used throughout the paper, and a software tutorial in R for these analyses is provided in the Supplementary Materials. PMID:26106341

  15. Definition of distance for nonlinear time series analysis of marked point process data

    Energy Technology Data Exchange (ETDEWEB)

    Iwayama, Koji, E-mail: koji@sat.t.u-tokyo.ac.jp [Research Institute for Food and Agriculture, Ryukoku Univeristy, 1-5 Yokotani, Seta Oe-cho, Otsu-Shi, Shiga 520-2194 (Japan); Hirata, Yoshito; Aihara, Kazuyuki [Institute of Industrial Science, The University of Tokyo, 4-6-1 Komaba, Meguro-ku, Tokyo 153-8505 (Japan)

    2017-01-30

    Marked point process data are time series of discrete events accompanied by some values, such as economic trades, earthquakes, and lightning strikes. A distance for marked point process data allows us to apply nonlinear time series analysis to such data. We propose a distance for marked point process data which can be calculated much faster than the existing distance when the number of marks is small. Furthermore, under some assumptions, the Kullback–Leibler divergences between posterior distributions for neighbors defined by this distance are small. We performed some numerical simulations showing that analysis based on the proposed distance is effective. - Highlights: • A new distance for marked point process data is proposed. • The distance can be computed fast enough for a small number of marks. • The method to optimize parameter values of the distance is also proposed. • Numerical simulations indicate that the analysis based on the distance is effective.

  16. Indigenous language and the preservation of African values; the ...

    African Journals Online (AJOL)

    Language is not only a medium of communication but also a matrix through which the culture, values, norms and mores of a people are transmitted. Unfortunately for the African, the indigenous languages of the continent were, as a result of European imperialist motives, subjected to a series of distorting and negative influences.

  17. Comment on ‘Series expansions from the corner transfer matrix renormalization group method: the hard-squares model’

    International Nuclear Information System (INIS)

    Jensen, Iwan

    2012-01-01

    Earlier this year Chan extended the low-density series for the hard-squares partition function κ(z) to 92 terms. Here we analyse this extended series focusing on the behaviour at the dominant singularity z_d which lies on the negative fugacity axis. We find that the series has a confluent singularity of order at least 2 at z_d with exponents θ = 0.833 33(2) and θ′ = 1.6676(3). We thus confirm that the exponent θ has the exact value 5/6 as observed by Dhar. (comment)

  18. MRI prognostic value in multiple myeloma

    International Nuclear Information System (INIS)

    Medina, G.; Mazzuco, J.; Corrado, C.; Hendler, M.

    2001-01-01

    Purpose: To evaluate the prognostic value of bone marrow (BM) MR infiltration patterns in untreated patients with MM, and to assess the results together with other prognostic factors (clinical, laboratory, radiological). Materials and methods: MR was performed in 21 patients with MM. Bone lesions, plasmacytomas and osteoporosis were assessed with skeletal radiographs. Results: MR of patients with MM revealed 3 abnormal BM involvement patterns: focal (FP), diffuse (DP) and variegated (VP). Unlike other reported series, we identified MR studies with 2 or more MR patterns in 13 patients. We define this particular type as a mixed pattern (MP). Conclusion: MR is more sensitive than conventional radiology in detecting bone lesions. The MR patterns combined with the other conventional staging factors provide a useful tool for disease prognosis. In our series, patients with DP, alone or combined with other patterns (mixed), had the worst prognosis. (author)

  19. Prediction about chaotic times series of natural circulation flow under rolling motion

    International Nuclear Information System (INIS)

    Yuan Can; Cai Qi; Guo Li; Yan Feng

    2014-01-01

    This paper proposes a chaotic time series prediction model that combines phase-space reconstruction with support vector machines. The model has been used to predict the coolant volume flow; since the numerical selection of the related parameters is a key factor for prediction precision, a synchronous parameter optimization method based on the particle swarm optimization algorithm is introduced. The average relative error between predicted and observed values was 1.5% and the relative precision was 0.9879. The results indicate that the model can be applied to natural circulation coolant volume flow prediction under rolling motion conditions with high accuracy and robustness. (authors)
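    As a rough illustration of the model class described above (delay-coordinate phase-space reconstruction feeding a support vector machine), the sketch below embeds a scalar flow series and fits a support vector regressor. The toy series, embedding dimension, delay and SVR hyper-parameters are assumptions, and the particle-swarm parameter search of the paper is not reproduced.

```python
import numpy as np
from sklearn.svm import SVR

def embed(series, dim=4, tau=2):
    """Delay-coordinate embedding: each row is
    [x(t-(dim-1)*tau), ..., x(t-tau), x(t)], the target is x(t+1)."""
    n = len(series) - (dim - 1) * tau - 1
    X = np.column_stack([series[i * tau: i * tau + n] for i in range(dim)])
    y = series[(dim - 1) * tau + 1: (dim - 1) * tau + 1 + n]
    return X, y

# toy stand-in for a measured coolant volume flow series
t = np.linspace(0, 60, 1500)
flow = 1.0 + 0.1 * np.sin(1.3 * t) + 0.05 * np.random.randn(t.size)

X, y = embed(flow)
split = int(0.8 * len(y))
model = SVR(kernel="rbf", C=10.0, epsilon=0.01).fit(X[:split], y[:split])

pred = model.predict(X[split:])
rel_err = np.mean(np.abs(pred - y[split:]) / np.abs(y[split:]))
print(f"average relative error: {100 * rel_err:.2f}%")
```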

  20. Excused and Unexcused--The Value of Labeling an Absence. Chronic Absenteeism in Oregon Elementary Schools. Part 4 of 4. September 2016. Research Brief Series

    Science.gov (United States)

    Oregon Department of Education, 2016

    2016-01-01

    This four part series of research briefs summarized detailed analysis of attendance and chronic absenteeism in Oregon. Brief 1 highlighted the importance of tracking chronic absenteeism rather than average daily attendance. The second brief in this series focused on student outcomes and attendance. Research suggests, and Oregon Department of…

  1. Short-circuit testing of monofilar Bi-2212 coils connected in series and in parallel

    International Nuclear Information System (INIS)

    Polasek, A; Dias, R; Serra, E T; Filho, O O; Niedu, D

    2010-01-01

    Superconducting Fault Current Limiters (SCFCLs) are one of the most promising technologies for fault current limitation. In the present work, resistive SCFCL components based on Bi-2212 monofilar coils are subjected to short-circuit testing. These SCFCL components can easily be connected in series and/or in parallel by using joints and clamps. This allows considerable flexibility in developing larger SCFCL devices, since the configuration and size of the whole device can be easily adapted to the operational conditions. The single components presented critical current (Ic) values of 240-260 A at 77 K. Short circuits lasting 40-120 ms were applied. A single component can withstand a voltage drop of 126-252 V (0.3-0.6 V/cm). Components connected in series withstand higher voltage levels, whereas parallel connection allows higher rated currents during normal operation, but the limited current is also higher. Prospective currents as high as 10-40 kA (peak value) were limited to 3-9 kA (peak value) in the first half cycle.

  2. A prediction method based on wavelet transform and multiple models fusion for chaotic time series

    International Nuclear Information System (INIS)

    Zhongda, Tian; Shujiang, Li; Yanhong, Wang; Yi, Sha

    2017-01-01

    In order to improve the prediction accuracy of chaotic time series, a prediction method based on wavelet transform and multiple-model fusion is proposed. The chaotic time series is decomposed and reconstructed by the wavelet transform, and approximation components and detail components are obtained. According to the different characteristics of each component, a least squares support vector machine (LSSVM) is used as the predictive model for the approximation components, with an improved free search algorithm utilized for predictive model parameter optimization. An autoregressive integrated moving average (ARIMA) model is used as the predictive model for the detail components. The predictive values of the multiple models are fused by the Gauss–Markov algorithm; the error variance of the fused results is smaller than that of any single model, and the prediction accuracy is improved. The simulation results are compared on two typical chaotic time series, the Lorenz and Mackey–Glass series, and show that the prediction method in this paper achieves better prediction accuracy.
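    The decomposition step of the scheme described above can be sketched with PyWavelets: the series is split into approximation and detail components, each component is then predicted by its own model (LSSVM for the approximation and ARIMA for the details in the paper), and the component forecasts are recombined. The wavelet choice, decomposition level and the trivial persistence "predictors" below are placeholder assumptions; the LSSVM, the improved free search optimization and the Gauss–Markov fusion of the paper are not reproduced.

```python
import numpy as np
import pywt

# toy chaotic-looking series standing in for Lorenz / Mackey-Glass data
x = np.cumsum(np.random.randn(512))

# multilevel discrete wavelet decomposition: [cA3, cD3, cD2, cD1]
coeffs = pywt.wavedec(x, "db4", level=3)

# In the paper each component gets its own predictive model
# (LSSVM for the approximation cA, ARIMA for the details cD).
# As a placeholder, each component is "predicted" by one-step persistence.
predicted_coeffs = [np.concatenate(([c[0]], c[:-1])) for c in coeffs]

# recombine the component predictions into a forecast of the original series
x_hat = pywt.waverec(predicted_coeffs, "db4")[: len(x)]
print("MSE of the placeholder forecast:", np.mean((x - x_hat) ** 2))
```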

  3. DETERMINING THE NEED FOR ZERO SERIES EXECUTION IN MANUFACTURING PROCESSES IN THE TEXTILE GARMENT INDUSTRY

    Directory of Open Access Journals (Sweden)

    OANA Ioan Pave

    2017-05-01

    Full Text Available Because industrial production requires the application of transformation procedures to material resources so that a clothing product achieves optimal use value with maximum economic efficiency, one of the main influential factors is the quality of the products. To make manufacturing processes more efficient, it is necessary to carry out a zero series in order to ensure the quality of the technological processes, as well as to prevent design deficiencies. The main operations undertaken to ensure the quality of the zero series include: creating the conditions for launching, tracking and finalizing the accompanying production documents under conditions similar to series production; having the zero series produced, as a rule, by the same workers who make up the series production line; equipping with the appropriate equipment and providing the necessary devices in order to create the technical conditions for the execution of the zero series; and providing technical assistance with the manufacturing and control documentation in order to eliminate design deficiencies. This paper presents the architecture of zero series execution in manufacturing processes in the textile garment industry. The information obtained from the zero-series analysis is directed to technical support, for possible corrections of the patterns according to which the products were manufactured.

  4. Forecasting malaria cases using climatic factors in delhi, India: a time series analysis.

    Science.gov (United States)

    Kumar, Varun; Mangal, Abha; Panesar, Sanjeet; Yadav, Geeta; Talwar, Richa; Raut, Deepak; Singh, Saudan

    2014-01-01

    Background. Malaria remains a public health problem in developing countries, and changing environmental and climatic factors pose the biggest challenge in fighting against the scourge of malaria. Therefore, this study was designed to forecast malaria cases using climatic factors as predictors in Delhi, India. Methods. The total number of monthly slide-positive malaria cases occurring from January 2006 to December 2013 was taken from the register maintained at the malaria clinic at the Rural Health Training Centre (RHTC), Najafgarh, Delhi. Climatic data of monthly mean rainfall, relative humidity, and mean maximum temperature were taken from the Regional Meteorological Centre, Delhi. The Expert Modeler of SPSS ver. 21 was used for analyzing the time series data. Results. An autoregressive integrated moving average model, ARIMA(0,1,1)(0,1,0)12, was the best-fit model and could explain 72.5% of the variability in the time series data. Rainfall (P value = 0.004) and relative humidity (P value = 0.001) were found to be significant predictors of malaria transmission in the study area. The seasonal adjusted factor (SAF) for malaria cases shows a peak during the months of August and September. Conclusion. ARIMA modeling of time series data is a simple and reliable tool for producing reliable forecasts for malaria in Delhi, India.
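    For readers who want to reproduce this kind of model, the sketch below fits a seasonal ARIMA(0,1,1)(0,1,0)12 with climatic regressors using statsmodels. The synthetic case counts and weather columns are placeholders for the Najafgarh clinic and meteorological data, which are not reproduced here.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

# placeholder monthly data standing in for the Jan 2006 - Dec 2013 observations
rng = pd.date_range("2006-01", periods=96, freq="MS")
rain = np.abs(np.random.randn(96)) * 100
humidity = 50 + 20 * np.random.rand(96)
cases = np.random.poisson(30 + 0.1 * rain + 0.2 * humidity)

exog = pd.DataFrame({"rainfall": rain, "humidity": humidity}, index=rng)
y = pd.Series(cases, index=rng, dtype=float)

# ARIMA(0,1,1)(0,1,0)12 with rainfall and relative humidity as predictors
model = SARIMAX(y, exog=exog, order=(0, 1, 1), seasonal_order=(0, 1, 0, 12))
fit = model.fit(disp=False)
print(fit.summary())

# forecast the next 12 months, given assumed future climate values
future_index = pd.date_range("2014-01", periods=12, freq="MS")
future_exog = pd.DataFrame(exog.iloc[-12:].to_numpy(),
                           columns=exog.columns, index=future_index)
print(fit.forecast(steps=12, exog=future_exog))
```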

  5. Economic Value of Army Foreign Military Sales

    Science.gov (United States)

    2015-12-01

    USASAC leads the AMC Security Assistance Enterprise, develops and manages security assistance programs and foreign military sales cases to build...that leads to cost savings and cost avoidance. The Shadow's FMS sales are currently 1.6% of the total units in operation and accounts for the same...

  6. Geometric Series via Probability

    Science.gov (United States)

    Tesman, Barry

    2012-01-01

    Infinite series is a challenging topic in the undergraduate mathematics curriculum for many students. In fact, there is a vast literature in mathematics education research on convergence issues. One of the most important types of infinite series is the geometric series. Their beauty lies in the fact that they can be evaluated explicitly and that…

  7. Emerging interdependence between stock values during financial crashes.

    Directory of Open Access Journals (Sweden)

    Jacopo Rocchi

    Full Text Available To identify emerging interdependencies between traded stocks we investigate the behavior of the stocks of FTSE 100 companies in the period 2000-2015, by looking at daily stock values. Exploiting the power of information-theoretical measures to extract direct influences between multiple time series, we compute the information flow across stock values to identify several different regimes. While only small information flows are detected in most of the period, a dramatically different situation occurs in the proximity of global financial crises, where stock values exhibit strong and substantial interdependence for a prolonged period. This behavior is consistent with what one would generally expect from a complex system near criticality, as in physical systems, showing the long-lasting effects of crashes on stock markets.

  8. Emerging interdependence between stock values during financial crashes.

    Science.gov (United States)

    Rocchi, Jacopo; Tsui, Enoch Yan Lok; Saad, David

    2017-01-01

    To identify emerging interdependencies between traded stocks we investigate the behavior of the stocks of FTSE 100 companies in the period 2000-2015, by looking at daily stock values. Exploiting the power of information-theoretical measures to extract direct influences between multiple time series, we compute the information flow across stock values to identify several different regimes. While only small information flows are detected in most of the period, a dramatically different situation occurs in the proximity of global financial crises, where stock values exhibit strong and substantial interdependence for a prolonged period. This behavior is consistent with what one would generally expect from a complex system near criticality, as in physical systems, showing the long-lasting effects of crashes on stock markets.

  9. VEHICLES REGISTERED IN THE FRENCH SPECIAL SERIES '431 K...' AND '431 CD...'

    CERN Multimedia

    Relations with the Host States Service

    2002-01-01

    1. Registration a) Entitlement Only members of the personnel holding a Carte spéciale (AT or FI series) or a Carte diplomatique (CD series), issued by the French Ministry of Foreign Affairs (hereinafter 'MAE'), are entitled to register vehicles in the '431 K ...' or '431 CD ...' special series (green plates). It is compulsory to register in one of these series vehicles that have been purchased or imported free of tax and/or customs duty. In the event of standard registration at a later date, the duty and/or tax will have to be paid on the basis of the currently applicable rates and the vehicle's residual value. Registration in one of these series is optional if the duty and/or tax has been paid in a European Union Member State. In this case, you are strongly recommended to retain documentary evidence of payment and pass this on to any subsequent buyer of the vehicle. This documentation will be required if a standard registration needs to be issued. Without it, the duty and/or tax will have to be pa...

  10. Measuring Nursing Value from the Electronic Health Record.

    Science.gov (United States)

    Welton, John M; Harper, Ellen M

    2016-01-01

    We report the findings of a big data nursing value expert group made up of 14 members of the nursing informatics, leadership, academic and research communities within the United States, tasked with 1. defining nursing value, 2. developing a common data model and metrics for nursing care value, and 3. developing nursing business intelligence tools using the nursing value data set. This work is a component of the Big Data and Nursing Knowledge Development conference series sponsored by the University of Minnesota School of Nursing. The panel met by conference call for fourteen 1.5-hour sessions, a total of 21 hours of interaction, from August 2014 through May 2015. The primary deliverables from the big data expert group were: development and publication of definitions and metrics for nursing value; construction of a common data model to extract key data from electronic health records; and measures of nursing costs and finance to provide a basis for developing nursing business intelligence and analysis systems.

  11. International Work-Conference on Time Series

    CERN Document Server

    Pomares, Héctor; Valenzuela, Olga

    2017-01-01

    This volume of selected and peer-reviewed contributions on the latest developments in time series analysis and forecasting updates the reader on topics such as analysis of irregularly sampled time series, multi-scale analysis of univariate and multivariate time series, linear and non-linear time series models, advanced time series forecasting methods, applications in time series analysis and forecasting, advanced methods and online learning in time series and high-dimensional and complex/big data time series. The contributions were originally presented at the International Work-Conference on Time Series, ITISE 2016, held in Granada, Spain, June 27-29, 2016. The series of ITISE conferences provides a forum for scientists, engineers, educators and students to discuss the latest ideas and implementations in the foundations, theory, models and applications in the field of time series analysis and forecasting. It focuses on interdisciplinary and multidisciplinary research encompassing the disciplines of comput...

  12. SERI Wind Energy Program

    Energy Technology Data Exchange (ETDEWEB)

    Noun, R. J.

    1983-06-01

    The SERI Wind Energy Program manages the areas of innovative research, wind systems analysis, and environmental compatibility for the U.S. Department of Energy. Since 1978, SERI wind program staff have conducted in-house aerodynamic and engineering analyses of novel concepts for wind energy conversion and have managed over 20 subcontracts to determine technical feasibility; the most promising of these concepts is the passive blade cyclic pitch control project. In the area of systems analysis, the SERI program has analyzed the impact of intermittent generation on the reliability of electric utility systems using standard utility planning models. SERI has also conducted methodology assessments. Environmental issues related to television interference and acoustic noise from large wind turbines have been addressed. SERI has identified the causes, effects, and potential control of acoustic noise emissions from large wind turbines.

  13. Record statistics of financial time series and geometric random walks.

    Science.gov (United States)

    Sabir, Behlool; Santhanam, M S

    2014-09-01

    The study of record statistics of correlated series in physics, such as random walks, is gaining momentum, and several analytical results have been obtained in the past few years. In this work, we study the record statistics of correlated empirical data for which random walk models have relevance. We obtain results for the record statistics of selected stock market data and the geometric random walk, primarily through simulations. We show that the distribution of the age of records is a power law with the exponent α lying in the range 1.5≤α≤1.8. Further, the longest record ages follow the Fréchet distribution of extreme value theory. The record statistics of geometric random walk series are in good agreement with those obtained from empirical stock data.
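    A minimal simulation of the quantities studied above (upper records of a geometric random walk and the ages of those records) is sketched below. The drift, volatility and sample sizes are arbitrary choices, and no claim is made about reproducing the fitted exponents.

```python
import numpy as np

def record_ages(series):
    """Ages (in steps) of the upper records of a series."""
    record_times = [0]
    current_max = series[0]
    for t, x in enumerate(series[1:], start=1):
        if x > current_max:
            current_max = x
            record_times.append(t)
    record_times.append(len(series))          # close the last open record
    return np.diff(record_times)

# geometric random walk: multiplicative steps, i.e. a random walk in log-price
rng = np.random.default_rng(0)
log_price = np.cumsum(rng.normal(loc=0.0, scale=0.01, size=(200, 2000)), axis=1)
ages = np.concatenate([record_ages(p) for p in np.exp(log_price)])

# empirical distribution of record ages (a power law is expected in the paper)
values, counts = np.unique(ages, return_counts=True)
print(values[:10], counts[:10] / counts.sum())
```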

  14. First and second order Markov chain models for synthetic generation of wind speed time series

    International Nuclear Information System (INIS)

    Shamshad, A.; Bawadi, M.A.; Wan Hussin, W.M.A.; Majid, T.A.; Sanusi, S.A.M.

    2005-01-01

    Hourly wind speed time series data from two meteorological stations in Malaysia have been used for stochastic generation of wind speed data using the transition matrix approach of the Markov chain process. The transition probability matrices have been formed using two different approaches: the first involves the use of a first-order transition probability matrix of a Markov chain, and the second involves the use of a second-order transition probability matrix that uses the current and preceding values to describe the next wind speed value. The algorithm to generate the wind speed time series from the transition probability matrices is described. Uniform random number generators have been used for transitions between successive time states and for within-state wind speed values. The ability of each approach to retain the statistical properties of the generated series is compared against the observed series. The main statistical properties used for this purpose are the mean, standard deviation, median, percentiles, Weibull distribution parameters, autocorrelations and spectral density of the wind speed values. The comparison of the observed wind speeds and the synthetically generated ones shows that the statistical characteristics are satisfactorily preserved.
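    The first-order variant of the transition-matrix approach is easy to sketch. In the code below the bin width, state count and the stand-in input series are assumptions; the within-state uniform sampling follows the description in the abstract.

```python
import numpy as np

rng = np.random.default_rng(1)
observed = rng.gamma(shape=2.0, scale=3.0, size=8760)  # stand-in hourly wind speeds (m/s)

# discretise the speeds into states of 1 m/s width
edges = np.arange(0.0, observed.max() + 1.0, 1.0)
states = np.digitize(observed, edges) - 1
n_states = len(edges)

# first-order transition probability matrix estimated from the observed series
counts = np.zeros((n_states, n_states))
for a, b in zip(states[:-1], states[1:]):
    counts[a, b] += 1
row_sums = counts.sum(axis=1, keepdims=True)
P = np.divide(counts, row_sums, out=np.zeros_like(counts), where=row_sums > 0)

# generate a synthetic series: sample the next state, then a uniform speed within it
synthetic = np.empty(8760)
state = int(states[0])
for t in range(8760):
    synthetic[t] = rng.uniform(edges[state], edges[state] + 1.0)
    row = P[state]
    state = rng.choice(n_states, p=row) if row.sum() > 0 else int(states[0])

print(observed.mean(), synthetic.mean())   # the mean should be roughly preserved
```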

  15. Permutation entropy based time series analysis: Equalities in the input signal can lead to false conclusions

    Energy Technology Data Exchange (ETDEWEB)

    Zunino, Luciano, E-mail: lucianoz@ciop.unlp.edu.ar [Centro de Investigaciones Ópticas (CONICET La Plata – CIC), C.C. 3, 1897 Gonnet (Argentina); Departamento de Ciencias Básicas, Facultad de Ingeniería, Universidad Nacional de La Plata (UNLP), 1900 La Plata (Argentina); Olivares, Felipe, E-mail: olivaresfe@gmail.com [Instituto de Física, Pontificia Universidad Católica de Valparaíso (PUCV), 23-40025 Valparaíso (Chile); Scholkmann, Felix, E-mail: Felix.Scholkmann@gmail.com [Research Office for Complex Physical and Biological Systems (ROCoS), Mutschellenstr. 179, 8038 Zurich (Switzerland); Biomedical Optics Research Laboratory, Department of Neonatology, University Hospital Zurich, University of Zurich, 8091 Zurich (Switzerland); Rosso, Osvaldo A., E-mail: oarosso@gmail.com [Instituto de Física, Universidade Federal de Alagoas (UFAL), BR 104 Norte km 97, 57072-970, Maceió, Alagoas (Brazil); Instituto Tecnológico de Buenos Aires (ITBA) and CONICET, C1106ACD, Av. Eduardo Madero 399, Ciudad Autónoma de Buenos Aires (Argentina); Complex Systems Group, Facultad de Ingeniería y Ciencias Aplicadas, Universidad de los Andes, Av. Mons. Álvaro del Portillo 12.455, Las Condes, Santiago (Chile)

    2017-06-15

    A symbolic encoding scheme, based on the ordinal relation between the amplitude of neighboring values of a given data sequence, should be implemented before estimating the permutation entropy. Consequently, equalities in the analyzed signal, i.e. repeated equal values, deserve special attention and treatment. In this work, we carefully study the effect that the presence of equalities has on permutation entropy estimated values when these ties are symbolized, as it is commonly done, according to their order of appearance. On the one hand, the analysis of computer-generated time series is initially developed to understand the incidence of repeated values on permutation entropy estimations in controlled scenarios. The presence of temporal correlations is erroneously concluded when true pseudorandom time series with low amplitude resolutions are considered. On the other hand, the analysis of real-world data is included to illustrate how the presence of a significant number of equal values can give rise to false conclusions regarding the underlying temporal structures in practical contexts. - Highlights: • Impact of repeated values in a signal when estimating permutation entropy is studied. • Numerical and experimental tests are included for characterizing this limitation. • Non-negligible temporal correlations can be spuriously concluded by repeated values. • Data digitized with low amplitude resolutions could be especially affected. • Analysis with shuffled realizations can help to overcome this limitation.
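    The symbolization issue discussed above can be seen directly in a small implementation. The sketch below computes a normalised permutation entropy with ties broken by order of appearance (a stable sort), which is exactly the convention whose side effects the paper analyses; the embedding dimension and test signals are arbitrary choices.

```python
import math
from collections import Counter

import numpy as np

def permutation_entropy(x, order=3, delay=1):
    """Normalised permutation entropy; ties are symbolised by order of appearance
    (stable sort), the convention examined in the paper."""
    n = len(x) - (order - 1) * delay
    patterns = Counter()
    for i in range(n):
        window = x[i: i + order * delay: delay]
        pattern = tuple(np.argsort(window, kind="stable"))
        patterns[pattern] += 1
    probs = np.array(list(patterns.values()), dtype=float) / n
    return -np.sum(probs * np.log(probs)) / math.log(math.factorial(order))

rng = np.random.default_rng(0)
white_noise = rng.normal(size=5000)
# a low amplitude resolution introduces many repeated values (ties)
coarse_noise = np.round(white_noise, 1)

print(permutation_entropy(white_noise))   # close to 1 for true white noise
print(permutation_entropy(coarse_noise))  # slightly lower: ties mimic temporal structure
```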

  16. Relative Radiometric Normalization and Atmospheric Correction of a SPOT 5 Time Series

    Directory of Open Access Journals (Sweden)

    Matthieu Rumeau

    2008-04-01

    Full Text Available Multi-temporal images acquired at high spatial and temporal resolution are an important tool for detecting change and analyzing trends, especially in agricultural applications. However, to ensure a reliable use of this kind of data, a rigorous radiometric normalization step is required. Normalization can be addressed by performing an atmospheric correction of each image in the time series. The main problem is the difficulty of obtaining an atmospheric characterization at a given acquisition date. In this paper, we investigate whether relative radiometric normalization can substitute for atmospheric correction. We develop an automatic method for relative radiometric normalization based on calculating linear regressions between unnormalized and reference images. The regressions are obtained using the reflectances of automatically selected invariant targets. We compare this method with an atmospheric correction method that uses the 6S model. The performances of both methods are compared using 18 images from a SPOT 5 time series acquired over Reunion Island. Results obtained for a set of manually selected invariant targets show excellent agreement between the two methods in all spectral bands: values of the coefficient of determination (r²) exceed 0.960, and bias magnitude values are less than 2.65. There is also a strong correlation between normalized NDVI values of sugarcane fields (r² = 0.959). Despite a relative error of 12.66% between values, very comparable NDVI patterns are observed.
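    The core of the normalization method, a per-band linear regression between an unnormalized image and the reference image over invariant targets, can be sketched as follows. The arrays and the invariant-target mask below are placeholders for the SPOT 5 reflectance data.

```python
import numpy as np

def relative_normalization(image, reference, invariant_mask):
    """Per-band gain/offset regression on invariant targets, applied to the whole image.

    image, reference : arrays of shape (bands, rows, cols) holding reflectance values
    invariant_mask   : boolean array of shape (rows, cols) selecting invariant targets
    """
    normalized = np.empty_like(image, dtype=float)
    for b in range(image.shape[0]):
        x = image[b][invariant_mask]
        y = reference[b][invariant_mask]
        gain, offset = np.polyfit(x, y, 1)     # least-squares line y = gain*x + offset
        normalized[b] = gain * image[b] + offset
    return normalized

# toy example with 4 bands and a fabricated set of invariant pixels
rng = np.random.default_rng(0)
reference = rng.uniform(0.05, 0.4, size=(4, 100, 100))
image = 0.9 * reference + 0.02 + 0.005 * rng.normal(size=reference.shape)
mask = np.zeros((100, 100), dtype=bool)
mask[::10, ::10] = True

corrected = relative_normalization(image, reference, mask)
print(np.abs(corrected - reference).mean())   # should be small after normalization
```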

  17. Activation barriers for series of exothermic homologous reactions. V. Boron group diatomic species reactions

    Science.gov (United States)

    Blue, Alan S.; Belyung, David P.; Fontijn, Arthur

    1997-09-01

    Semiempirical configuration interaction (SECI) theory is used to predict activation barriers E, as defined by k(T) = A Tⁿ exp(-E/RT). Previously SECI has been applied to homologous series of oxidation reactions of s¹, s², and s²p¹ metal atoms. Here it is extended to oxidation reactions of diatomic molecules containing one s²p¹ atom. E values are calculated for the reactions of BH, BF, BCl, AlF, AlCl, AlBr, GaF, GaI, InCl, InBr, InI, TlF, TlCl, TlBr, and TlI with O2, CO2, SO2, or N2O. These values correlate with the sums of the ionization potentials and Σ-Π promotion energies of the former minus the electron affinities of the latter. In the earlier work n was chosen somewhat arbitrarily, which affected the absolute values of E. Here it is shown that examination of available experimental and theoretical results allows determination of the best values of n. Using this approach yields n = 1.9 for the present series. For the seven reactions which have been studied experimentally, the average deviation of the SECI activation barrier prediction from experiment is 4.0 kJ mol⁻¹. Energy barriers are calculated for another 52 reactions.
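    For reference, the modified Arrhenius form used above can be evaluated directly. The pre-exponential factor and barrier in the sketch below are arbitrary illustrative numbers, with n fixed at the value 1.9 adopted for this series.

```python
import math

R = 8.314462618e-3   # gas constant in kJ mol^-1 K^-1

def rate_coefficient(T, A, E, n=1.9):
    """k(T) = A * T**n * exp(-E / (R * T)), with E in kJ/mol and T in kelvin."""
    return A * T**n * math.exp(-E / (R * T))

# illustrative numbers only (not fitted values from the paper)
for T in (300.0, 600.0, 1200.0):
    print(T, rate_coefficient(T, A=1.0e-16, E=20.0))
```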

  18. Integrated Campaign Probabilistic Cost, Schedule, Performance, and Value for Program Office Support

    Science.gov (United States)

    Cornelius, David; Sasamoto, Washito; Daugherty, Kevin; Deacon, Shaun

    2012-01-01

    This paper describes an integrated assessment tool developed at NASA Langley Research Center that incorporates probabilistic analysis of life cycle cost, schedule, launch performance, on-orbit performance, and value across a series of planned space-based missions, or campaign. Originally designed as an aid in planning the execution of missions to accomplish the National Research Council 2007 Earth Science Decadal Survey, it utilizes Monte Carlo simulation of a series of space missions for assessment of resource requirements and expected return on investment. Interactions between simulated missions are incorporated, such as competition for launch site manifest, to capture unexpected and non-linear system behaviors. A novel value model is utilized to provide an assessment of the probabilistic return on investment. A demonstration case is discussed to illustrate the tool utility.

  19. A study of regional trends in annual and seasonal precipitation and runoff series

    Energy Technology Data Exchange (ETDEWEB)

    Tveito, O.E.; Hisdal, H.

    1994-03-10

    In this study long and homogeneous time series of runoff and precipitation are studied to identify variations in time and space. The method of empirical orthogonal functions (EOF-method) is applied. Both annual observations, smoothed (using Gauss filter) and seasonal values are analyzed. The analysis shows that the temporal variations in runoff and precipitation coincide. The deviations occurring in the seasonal values are caused by snow accumulation and snow melt. In the filtered series temporal trends are found. A comparison between the different normal periods has been carried out for precipitation. The 1900-30 and 1960-90 periods differ from the 1930-60 period. This may be caused by different weather types dominating the different periods. The different weather types are reflected in different empirical orthogonal functions. This is verified by regional studies. The coinciding patterns in runoff and precipitation are important aspects in climate studies and for extrapolation purposes. 11 refs., 20 figs., 1 tab.
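    The EOF decomposition used in the study can be reproduced with a singular value decomposition of the anomaly matrix. The synthetic station series below are placeholders for the Norwegian runoff and precipitation records.

```python
import numpy as np

rng = np.random.default_rng(0)
years, stations = 90, 12
# placeholder annual series: a shared regional signal plus station-specific noise
regional = rng.normal(size=years)
data = np.outer(regional, rng.uniform(0.5, 1.5, stations)) \
       + 0.5 * rng.normal(size=(years, stations))

# EOF analysis = SVD of the anomaly (mean-removed) matrix
anomalies = data - data.mean(axis=0)
U, s, Vt = np.linalg.svd(anomalies, full_matrices=False)

eofs = Vt                        # spatial patterns (one row per EOF)
pcs = U * s                      # principal component time series
explained = s**2 / np.sum(s**2)  # fraction of variance carried by each EOF
print(explained[:3])
```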

  20. An Extension of the Mean Value Theorem for Integrals

    Science.gov (United States)

    Khalili, Parviz; Vasiliu, Daniel

    2010-01-01

    In this note we present an extension of the mean value theorem for integrals. The extension we consider is motivated by an older result (here referred as Corollary 2), which is quite classical for the literature of Mathematical Analysis or Calculus. We also show an interesting application for computing the sum of a harmonic series.

  1. Common pitfalls in statistical analysis: “P” values, statistical significance and confidence intervals

    Science.gov (United States)

    Ranganathan, Priya; Pramesh, C. S.; Buyse, Marc

    2015-01-01

    In the second part of a series on pitfalls in statistical analysis, we look at various ways in which a statistically significant study result can be expressed. We debunk some of the myths regarding the ‘P’ value, explain the importance of ‘confidence intervals’, and clarify the importance of including both values in a paper. PMID:25878958

  2. Temporal relationships between awakening cortisol and psychosocial variables in inpatients with anorexia nervosa - A time series approach.

    Science.gov (United States)

    Wild, Beate; Stadnitski, Tatjana; Wesche, Daniela; Stroe-Kunold, Esther; Schultz, Jobst-Hendrik; Rudofsky, Gottfried; Maser-Gluth, Christiane; Herzog, Wolfgang; Friederich, Hans-Christoph

    2016-04-01

    The aim of the study was to investigate the characteristics of the awakening salivary cortisol in patients with anorexia nervosa (AN) using a time series design. We included ten AN inpatients, six with a very low BMI (high symptom severity, HSS group) and four patients with less severe symptoms (low symptom severity, LSS group). Patients collected salivary cortisol daily upon awakening. The number of collected saliva samples varied across patients between n=65 and n=229 (due to the different lengths of their inpatient stay). In addition, before retiring, the patients answered questions daily on the handheld regarding disorder-related psychosocial variables. The analysis of cortisol and diary data was conducted by using a time series approach. Time series showed that the awakening cortisol of the AN patients was elevated as compared to a control group. Cortisol measurements of patients with LSS essentially fluctuated in a stationary manner around a constant mean. The series of patients with HSS were generally less stable; four HSS patients showed a non-stationary cortisol awakening series. Antipsychotic medication did not change awakening cortisol in a specific way. The lagged dependencies between cortisol and depressive feelings became significant for four patients. Here, higher cortisol values were temporally associated with higher values of depressive feelings. Upon awakening, the cortisol of all AN patients was in the standard range but elevated as compared to healthy controls. Patients with HSS appeared to show less stable awakening cortisol time series compared to patients with LSS. Copyright © 2016 Elsevier B.V. All rights reserved.

  3. The management of production value stream factors in a foundry

    Directory of Open Access Journals (Sweden)

    S. Borkowski

    2010-01-01

    Full Text Available The connection of two value streams, production and human resources, is proposed as a new approach to the production process. To assess the factors of the production value stream, the elements at the top of the Toyota house, as well as Toyota's fourth and sixth management principles, were used. On the basis of feedback from the respondents (the foundry workers), the order of importance of the decisive factors that level the workload and require standardization can be determined.

  4. Study of the dependence of temporal resolution on activity for a Philips Gemini TF PET/CT scanner by applying a statistical analysis of time series; Estudio de la dependencia de la resolucion temporal con la actividad para un escaner PET-TAC philips gemini TF aplicando un analisis estadistico de series temporales

    Energy Technology Data Exchange (ETDEWEB)

    Sanchez Merino, G.; Cortes Rpdicio, J.; Lope Lope, R.; Martin Gonzalez, T.; Garcia Fidalgo, M. A.

    2013-07-01

    The aim of the present work is to study the dependence of temporal resolution on activity, using statistical techniques applied to the time series of temporal resolution values measured during the daily equipment checks. (Author)

  5. Extreme events in total ozone: Spatio-temporal analysis from local to global scale

    Science.gov (United States)

    Rieder, Harald E.; Staehelin, Johannes; Maeder, Jörg A.; Ribatet, Mathieu; di Rocco, Stefania; Jancso, Leonhardt M.; Peter, Thomas; Davison, Anthony C.

    2010-05-01

    dynamics (NAO, ENSO) on total ozone is a global feature in the northern mid-latitudes (Rieder et al., 2010c). In a next step frequency distributions of extreme events are analyzed on global scale (northern and southern mid-latitudes). A specific focus here is whether findings gained through analysis of long-term European ground based stations can be clearly identified as a global phenomenon. By showing results from these three types of studies an overview of extreme events in total ozone (and the dynamical and chemical features leading to those) will be presented from local to global scales. References: Coles, S.: An Introduction to Statistical Modeling of Extreme Values, Springer Series in Statistics, ISBN:1852334592, Springer, Berlin, 2001. Ribatet, M.: POT: Modelling peaks over a threshold, R News, 7, 34-36, 2007. Rieder, H.E., Staehelin, J., Maeder, J.A., Ribatet, M., Stübi, R., Weihs, P., Holawe, F., Peter, T., and A.D., Davison (2010): Extreme events in total ozone over Arosa - Part I: Application of extreme value theory, to be submitted to ACPD. Rieder, H.E., Staehelin, J., Maeder, J.A., Ribatet, M., Stübi, R., Weihs, P., Holawe, F., Peter, T., and A.D., Davison (2010): Extreme events in total ozone over Arosa - Part II: Fingerprints of atmospheric dynamics and chemistry and effects on mean values and long-term changes, to be submitted to ACPD. Rieder, H.E., Jancso, L., Staehelin, J., Maeder, J.A., Ribatet, Peter, T., and A.D., Davison (2010): Extreme events in total ozone over the northern mid-latitudes: A case study based on long-term data sets from 5 ground-based stations, in preparation. Staehelin, J., Renaud, A., Bader, J., McPeters, R., Viatte, P., Hoegger, B., Bugnion, V., Giroud, M., and Schill, H.: Total ozone series at Arosa (Switzerland): Homogenization and data comparison, J. Geophys. Res., 103(D5), 5827-5842, doi:10.1029/97JD02402, 1998a. Staehelin, J., Kegel, R., and Harris, N. R.: Trend analysis of the homogenized total ozone series of Arosa

  6. Bivariate extreme value with application to PM10 concentration analysis

    Science.gov (United States)

    Amin, Nor Azrita Mohd; Adam, Mohd Bakri; Ibrahim, Noor Akma; Aris, Ahmad Zaharin

    2015-05-01

    This study focuses on bivariate extremes of renormalized componentwise maxima with the generalized extreme value distribution as the marginal function. The limiting joint distributions of several parametric models are presented. Maximum likelihood estimation is employed for parameter estimation, and the best model is selected based on the Akaike Information Criterion. The weekly and monthly componentwise maxima series are extracted from the original observations of daily maximum PM10 data for two air quality monitoring stations located in Pasir Gudang and Johor Bahru. Ten years of data, from 2001 to 2010, are considered for both stations. The asymmetric negative logistic model is found to be the best-fitting bivariate extreme model for both the weekly and monthly maxima componentwise series. However, the dependence parameters show that the variables in the weekly maxima series are more dependent on each other than those in the monthly maxima series.
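    The marginal step of the analysis (fitting a generalized extreme value distribution to componentwise block maxima by maximum likelihood) can be sketched with SciPy. The daily PM10 values below are synthetic placeholders for the Pasir Gudang and Johor Bahru records, and the bivariate dependence model itself is not reproduced.

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(0)
days = 10 * 365
pm10_a = rng.lognormal(mean=3.6, sigma=0.35, size=days)   # placeholder daily maxima, station A
pm10_b = rng.lognormal(mean=3.5, sigma=0.40, size=days)   # placeholder daily maxima, station B

def monthly_maxima(x, block=30):
    """Componentwise block maxima over non-overlapping blocks of `block` days."""
    n = (len(x) // block) * block
    return x[:n].reshape(-1, block).max(axis=1)

max_a, max_b = monthly_maxima(pm10_a), monthly_maxima(pm10_b)

# maximum likelihood GEV fit for each margin (SciPy's c is minus the usual shape xi)
for name, m in (("station A", max_a), ("station B", max_b)):
    c, loc, scale = genextreme.fit(m)
    print(name, dict(shape_c=round(c, 3), loc=round(loc, 1), scale=round(scale, 1)))
```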

  7. Modeling climate change impacts on combined sewer overflow using synthetic precipitation time series.

    Science.gov (United States)

    Bendel, David; Beck, Ferdinand; Dittmer, Ulrich

    2013-01-01

    In the presented study climate change impacts on combined sewer overflows (CSOs) in Baden-Wuerttemberg, Southern Germany, were assessed based on continuous long-term rainfall-runoff simulations. As input data, synthetic rainfall time series were used. The applied precipitation generator NiedSim-Klima accounts for climate change effects on precipitation patterns. Time series for the past (1961-1990) and future (2041-2050) were generated for various locations. Comparing the simulated CSO activity of both periods we observe significantly higher overflow frequencies for the future. Changes in overflow volume and overflow duration depend on the type of overflow structure. Both values will increase at simple CSO structures that merely divide the flow, whereas they will decrease when the CSO structure is combined with a storage tank. However, there is a wide variation between the results of different precipitation time series (representative for different locations).

  8. Impact of arachidonic versus eicosapentaenoic acid on exotoxin-induced lung vascular leakage: relation to 4-series versus 5-series leukotriene generation.

    Science.gov (United States)

    Grimminger, F; Wahn, H; Mayer, K; Kiss, L; Walmrath, D; Seeger, W

    1997-02-01

    Escherichia coli hemolysin (HlyA) is a proteinaceous pore-forming exotoxin that is implicated as a significant pathogenicity factor in extraintestinal E. coli infections including sepsis. In perfused rabbit lungs, subcytolytic concentrations of the toxin evoke thromboxane-mediated vasoconstriction and prostanoid-independent protracted vascular permeability increase (11). In the present study, the influence of submicromolar concentrations of free arachidonic acid (AA) and eicosapentaenoic acid (EPA) on the HlyA-induced leakage response was investigated. HlyA at concentration from 0.02 to 0.06 hemolytic units/ml provoked a dose-dependent, severalfold increase in the capillary filtration coefficient (Kfc), accompanied by the release of leukotriene(LT)B4, LTC4, and LTE4 into the recirculating buffer fluid. Simultaneous application of 100 nmol/L AA markedly augmented the HlyA-elicited leakage response, concomitant with an amplification of LTB4 release and a change in the kinetics of cysteinyl-LT generation. In contrast, 50 to 200 nmol/L EPA suppressed in a dose-dependent manner the HlyA-induced increase in Kfc values. This was accompanied by a blockage of 4-series LT generation and a dose-dependent appearance of LTB5, LTC5, and LTE5. In addition, EPA fully antagonized the AA-induced amplification of the HlyA-provoked Kfc increase, again accompanied by a shift from 4-series to 5-series LT generation. We conclude that the vascular leakage provoked by HlyA in rabbit lungs is differentially influenced by free AA versus free EPA, related to the generation of 4- versus 5-series leukotrienes. The composition of lipid emulsions used for parenteral nutrition may thus influence inflammatory capillary leakage.

  9. Spectral Estimation of UV-Vis Absorbance Time Series for Water Quality Monitoring

    Directory of Open Access Journals (Sweden)

    Leonardo Plazas-Nossa

    2017-05-01

    Full Text Available Context: Signals recorded as multivariate time series by UV-Vis absorbance sensors installed in urban sewer systems can be non-stationary, which complicates the analysis of water quality monitoring data. This work proposes to perform spectral estimation using the Box-Cox transformation and differencing in order to obtain multivariate time series that are stationary in the wide sense. Additionally, Principal Component Analysis (PCA) is applied to reduce their dimensionality. Method: Three different UV-Vis absorbance time series for different Colombian locations were studied: (i) El-Salitre Wastewater Treatment Plant (WWTP) in Bogotá; (ii) Gibraltar Pumping Station (GPS) in Bogotá; and (iii) San-Fernando WWTP in Itagüí. Each UV-Vis absorbance time series had an equal number of samples (5705). The estimation of the power spectral density is obtained using the average of modified periodograms with a rectangular window and an overlap of 50%, with the 20 most important harmonics from the Discrete Fourier Transform (DFT) and the Inverse Fast Fourier Transform (IFFT). Results: Dimensionality reduction of the absorbance time series using PCA resulted in 6, 8 and 7 principal components for each study site respectively, altogether explaining more than 97% of their variability. Differences below 30% for the UV range were obtained for the three study sites, while for the visible range the maximum differences obtained were: (i) 35% for El-Salitre WWTP; (ii) 61% for GPS; and (iii) 75% for San-Fernando WWTP. Conclusions: The Box-Cox transformation and the differencing process applied to the UV-Vis absorbance time series for the study sites (El-Salitre, GPS and San-Fernando) allowed the variance to be reduced and the trend of the time series to be eliminated. A pre-processing of UV-Vis absorbance time series is recommended to detect and remove outliers and then apply the proposed process for spectral estimation. Language: Spanish.
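    The processing chain described above (Box-Cox transformation, differencing, PCA, then averaged modified periodograms with a rectangular window and 50% overlap) can be sketched as follows. The absorbance matrix, segment length and 97% variance threshold are placeholder assumptions, not the study's exact settings.

```python
import numpy as np
from scipy import signal, stats

rng = np.random.default_rng(0)
n_samples, n_wavelengths = 5705, 35
# placeholder UV-Vis absorbance time series (samples x wavelengths), strictly positive
absorbance = np.exp(0.3 * rng.normal(size=(n_samples, n_wavelengths))) + 0.1

# 1) Box-Cox transformation and first differencing of each wavelength series
transformed = np.column_stack(
    [stats.boxcox(absorbance[:, j])[0] for j in range(n_wavelengths)])
stationary = np.diff(transformed, axis=0)

# 2) dimensionality reduction with PCA (via SVD of the centred data)
centred = stationary - stationary.mean(axis=0)
U, s, Vt = np.linalg.svd(centred, full_matrices=False)
explained = np.cumsum(s**2) / np.sum(s**2)
k = int(np.searchsorted(explained, 0.97) + 1)   # components explaining >= 97%
scores = (U * s)[:, :k]

# 3) spectral estimation: averaged modified periodograms, rectangular window, 50% overlap
nperseg = 256
f, psd = signal.welch(scores[:, 0], window="boxcar",
                      nperseg=nperseg, noverlap=nperseg // 2)
print(k, f[np.argmax(psd)])
```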

  10. A series solution for horizontal infiltration in an initially dry aquifer

    Science.gov (United States)

    Furtak-Cole, Eden; Telyakovskiy, Aleksey S.; Cooper, Clay A.

    2018-06-01

    The porous medium equation (PME) is a generalization of the traditional Boussinesq equation for hydraulic conductivity as a power law function of height. We analyze the horizontal recharge of an initially dry unconfined aquifer of semi-infinite extent, as would be found in an aquifer adjacent to a rising river. If the water level can be modeled as a power law function of time, similarity variables can be introduced and the original problem can be reduced to a boundary value problem for a nonlinear ordinary differential equation. The position of the advancing front is not known ahead of time and must be found in the process of solution. We present an analytical solution in the form of a power series, with the coefficients of the series given by a recurrence relation. The analytical solution compares favorably with a highly accurate numerical solution, and only a small number of terms of the series are needed to achieve high accuracy in the scenarios considered here. We also conduct a series of physical experiments in an initially dry wedged Hele-Shaw cell, where flow is modeled by a special form of the PME. Our analytical solution closely matches the hydraulic head profiles in the Hele-Shaw cell experiment.

  11. Implementing Target Value Design.

    Science.gov (United States)

    Alves, Thais da C L; Lichtig, Will; Rybkowski, Zofia K

    2017-04-01

    An alternative to the traditional way of designing projects is the process of target value design (TVD), which takes different departure points to start the design process. The TVD process starts with the client defining an allowable cost that needs to be met by the design and construction teams. An expected cost in the TVD process is defined through multiple interactions between multiple stakeholders who define wishes and others who define ways of achieving these wishes. Finally, a target cost is defined based on the expected profit the design and construction teams are expecting to make. TVD follows a series of continuous improvement efforts aimed at reaching the desired goals for the project and its associated target value cost. The process takes advantage of rapid cycles of suggestions, analyses, and implementation that starts with the definition of value for the client. In the traditional design process, the goal is to identify user preferences and find solutions that meet the needs of the client's expressed preferences. In the lean design process, the goal is to educate users about their values and advocate for a better facility over the long run; this way owners can help contractors and designers to identify better solutions. This article aims to inform the healthcare community about tools and techniques commonly used during the TVD process and how they can be used to educate and support project participants in developing better solutions to meet their needs now as well as in the future.

  12. Scaling properties of Polish rain series

    Science.gov (United States)

    Licznar, P.

    2009-04-01

    implementation of double trace moment method allowed for estimation of local universal multifractal rainfall parameters (α=0.69; C1=0.34; H=-0.01). The research proved the fractal character of rainfall process support and multifractal character of the rainfall intensity values variability among analyzed time series. It is believed that scaling of local Wroclaw's rainfalls for timescales at the range from 24 hours up to 5 minutes opens the door for future research concerning for example random cascades implementation for daily precipitation totals disaggregation for smaller time intervals. The results of such a random cascades functioning in a form of 5 minute artificial rainfall scenarios could be of great practical usability for needs of urban hydrology, and design and hydrodynamic modeling of storm water and combined sewage conveyance systems.

  13. Robust and Adaptive Online Time Series Prediction with Long Short-Term Memory

    Directory of Open Access Journals (Sweden)

    Haimin Yang

    2017-01-01

    Full Text Available Online time series prediction is the mainstream method in a wide range of fields, ranging from speech analysis and noise cancellation to stock market analysis. However, real-world data often contain many outliers as the length of the time series increases. These outliers can mislead the learned model if treated as normal points in the process of prediction. To address this issue, in this paper we propose a robust and adaptive online gradient learning method, RoAdam (Robust Adam), for long short-term memory (LSTM) to predict time series with outliers. This method tunes the learning rate of the stochastic gradient algorithm adaptively in the process of prediction, which reduces the adverse effect of outliers. It tracks the relative prediction error of the loss function with a weighted average, through a modification of Adam, a popular stochastic gradient algorithm for training deep neural networks. In our algorithm, a large value of the relative prediction error corresponds to a small learning rate, and vice versa. The experiments on both synthetic data and real time series show that our method achieves better performance compared to the existing methods based on LSTM.

  14. Robust and Adaptive Online Time Series Prediction with Long Short-Term Memory.

    Science.gov (United States)

    Yang, Haimin; Pan, Zhisong; Tao, Qing

    2017-01-01

    Online time series prediction is the mainstream method in a wide range of fields, ranging from speech analysis and noise cancellation to stock market analysis. However, real-world data often contain many outliers as the length of the time series increases. These outliers can mislead the learned model if treated as normal points in the process of prediction. To address this issue, in this paper we propose a robust and adaptive online gradient learning method, RoAdam (Robust Adam), for long short-term memory (LSTM) to predict time series with outliers. This method tunes the learning rate of the stochastic gradient algorithm adaptively in the process of prediction, which reduces the adverse effect of outliers. It tracks the relative prediction error of the loss function with a weighted average, through a modification of Adam, a popular stochastic gradient algorithm for training deep neural networks. In our algorithm, a large value of the relative prediction error corresponds to a small learning rate, and vice versa. The experiments on both synthetic data and real time series show that our method achieves better performance compared to the existing methods based on LSTM.
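    The sketch below illustrates one possible reading of the idea (scaling Adam's step size down when the tracked relative prediction error grows), applied to a toy one-step linear predictor rather than an LSTM. The scaling rule, smoothing constant and hyper-parameters are assumptions, not the authors' exact RoAdam update.

```python
import numpy as np

rng = np.random.default_rng(0)

# toy one-step-ahead linear predictor standing in for the LSTM
w = np.zeros(3)
m = np.zeros(3); v = np.zeros(3)                # Adam first and second moments
beta1, beta2, eps, base_lr = 0.9, 0.999, 1e-8, 0.01
err_avg = 1.0                                   # running weighted average of relative error

series = np.sin(0.1 * np.arange(2000)) + 0.05 * rng.normal(size=2000)
series[::97] += 5.0                             # inject occasional outliers

for t in range(3, len(series)):
    x = series[t - 3: t]
    pred = w @ x
    err = pred - series[t]

    # relative prediction error tracked with a weighted average (the adaptive part)
    rel_err = abs(err) / (abs(series[t]) + eps)
    err_avg = 0.99 * err_avg + 0.01 * rel_err
    lr = base_lr / (1.0 + rel_err / (err_avg + eps))   # large relative error -> small step

    grad = err * x                              # gradient of 0.5*err**2 w.r.t. w
    m = beta1 * m + (1 - beta1) * grad
    v = beta2 * v + (1 - beta2) * grad**2
    m_hat = m / (1 - beta1 ** (t - 2))
    v_hat = v / (1 - beta2 ** (t - 2))
    w -= lr * m_hat / (np.sqrt(v_hat) + eps)

print("learned weights:", w)
```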

  15. Avoid Filling Swiss Cheese with Whipped Cream; Imputation Techniques and Evaluation Procedures for Cross-Country Time Series

    OpenAIRE

    Michael Weber; Michaela Denk

    2011-01-01

    International organizations collect data from national authorities to create multivariate cross-sectional time series for their analyses. As data from countries with not yet well-established statistical systems may be incomplete, the bridging of data gaps is a crucial challenge. This paper investigates data structures and missing data patterns in the cross-sectional time series framework, reviews missing value imputation techniques used for micro data in official statistics, and discusses the...

  16. Spectral Unmixing Analysis of Time Series Landsat 8 Images

    Science.gov (United States)

    Zhuo, R.; Xu, L.; Peng, J.; Chen, Y.

    2018-05-01

    Temporal analysis of Landsat 8 images opens up new opportunities in the unmixing procedure. Although spectral analysis of time series Landsat imagery has its own advantages, it has rarely been studied. Nevertheless, using the temporal information can provide improved unmixing performance when compared to independent image analyses. Moreover, different land cover types may demonstrate different temporal patterns, which can aid their discrimination. Therefore, this letter presents time series K-P-Means, a new solution to the problem of unmixing time series Landsat imagery. The proposed approach obtains "purified" pixels in order to achieve optimal unmixing performance. The vertex component analysis (VCA) is used to extract endmembers for endmember initialization. First, nonnegative least squares (NNLS) is used to estimate abundance maps by using the endmembers. Then, each endmember is updated as the mean value of its "purified" pixels, i.e., the residual of the mixed pixel after excluding the contribution of all nondominant endmembers. Assembling the two main steps (abundance estimation and endmember update) into an iterative optimization framework generates the complete algorithm. Experiments using both simulated and real Landsat 8 images show that the proposed "joint unmixing" approach provides more accurate endmember and abundance estimates than the "separate unmixing" approach.
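
    The alternating scheme summarized above can be pictured with the short sketch below, assuming endmembers have already been initialized (for example by VCA); the abundance step uses nonnegative least squares and the endmember step averages the "purified" pixels. This is a hedged reading of the described procedure, not the authors' code, and the helper name kp_means_unmix is hypothetical.

    # Minimal sketch of the alternating scheme described above, assuming endmembers
    # have already been initialized (e.g. by VCA). Y: (bands, pixels) mixed spectra,
    # E: (bands, k) endmember matrix. Not the authors' code.
    import numpy as np
    from scipy.optimize import nnls

    def kp_means_unmix(Y, E, n_iter=10):
        bands, n_pix = Y.shape
        k = E.shape[1]
        A = np.zeros((k, n_pix))
        for _ in range(n_iter):
            # Abundance step: nonnegative least squares per pixel.
            for j in range(n_pix):
                A[:, j], _ = nnls(E, Y[:, j])
            # Endmember step: each endmember is the mean of its "purified" pixels,
            # i.e. the residual after removing all non-dominant endmember contributions.
            dominant = A.argmax(axis=0)
            for i in range(k):
                idx = np.where(dominant == i)[0]
                if idx.size == 0:
                    continue
                others = np.delete(np.arange(k), i)
                purified = Y[:, idx] - E[:, others] @ A[np.ix_(others, idx)]
                scale = np.maximum(A[i, idx], 1e-8)
                E[:, i] = (purified / scale).mean(axis=1)
        return E, A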

  17. Green Decision Making: How Systemic Planning can support Strategic Decision Making for Sustainable Transport Development

    DEFF Research Database (Denmark)

    Leleur, Steen

    The book is based on my participation in the SUSTAIN research project 2012-2017 about National Sustainable Transport Planning funded by the Danish Research Council (Innovationsfonden). Many of the issues treated here have a backdrop in my book Complex Strategic Choices – Applying Systemic Planning for Strategic Management. The book was published in 2012 by Springer-Verlag, London, as a research monograph in the publisher’s series about Decision Engineering. The intention behind this new book – with its focus upon ‘greening’ of strategic decisions – is to provide a general and less technical description ... to this application area. In fact a company relocation decision case has been used to introduce the potential of SP as regards providing decision support for strategic decision making. A main concern in this presentation of SP, which deviates from the Springer book referred to above, is to highlight that ‘greening...

  18. Measurement of radionuclide activities of uranium-238 series in soil samples by gamma spectrometry: case of Vinaninkarena

    International Nuclear Information System (INIS)

    Randrianantenaina, F.R.

    2017-01-01

    The aim of this work is to determine the activity level of radionuclides of the uranium-238 series. Eight soil samples were collected at the Rural Commune of Vinaninkarena. After secular equilibrium was reached, these samples were measured using a gamma spectrometry system in the Nuclear Analyses and Techniques Department of INSTN-Madagascar, with an HPGe detector (30 % relative efficiency) and Genie 2000 software. The activities obtained vary from (78 ± 2) Bq.kg⁻¹ to (49 231 ± 415) Bq.kg⁻¹. Among these eight samples, three activity levels are distinguished. Low activity corresponds to values lower than or equal to (89 ± 3) Bq.kg⁻¹. Average activity corresponds to values between (186 ± 1) Bq.kg⁻¹ and (1049 ± 7) Bq.kg⁻¹. And high activity corresponds to values higher than or equal to (14501 ± 209) Bq.kg⁻¹. According to UNSCEAR 2000, these values are all higher than the world average value, which is 35 Bq.kg⁻¹. This is due to the localities of the sampling points. The variation of the activity level depends on the radionuclide concentration of the uranium-238 series in the soil. [fr

  19. The Use of Sentinel-1 Time-Series Data to Improve Flood Monitoring in Arid Areas

    Directory of Open Access Journals (Sweden)

    Sandro Martinis

    2018-04-01

    Full Text Available Due to the similarity of the radar backscatter over open water and over sand surfaces, reliable near real-time flood mapping based on satellite radar sensors is usually not possible in arid areas. Within this study, an approach is presented to enhance the results of an automatic Sentinel-1 flood processing chain by removing overestimations of the water extent related to low-backscattering sand surfaces using a Sand Exclusion Layer (SEL) derived from time-series statistics of Sentinel-1 data sets. The methodology was tested and validated on a flood event in May 2016 at the Webi Shabelle River, Somalia and Ethiopia, which was covered by a time-series of 202 Sentinel-1 scenes within the period June 2014 to May 2017. The approach proved capable of significantly improving the classification accuracy of the Sentinel-1 flood service within this study site. The Overall Accuracy increased by ~5% to a value of 98.5% and the User’s Accuracy increased by 25.2% to a value of 96.0%. Experimental results have shown that the classification accuracy is influenced by several parameters such as the length of the time-series used for generating the SEL.
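
    A minimal sketch of how a Sand Exclusion Layer might be derived from time-series statistics and applied to a flood mask is given below; the backscatter threshold of -15 dB and the 90% persistence fraction are placeholder values, not the parameters used in the study.

    # Illustrative sketch: derive a Sand Exclusion Layer (SEL) from a time series of
    # Sentinel-1 backscatter (dB) and use it to remove permanently low-backscatter
    # (sand-like) pixels from a flood mask. Threshold values are placeholders.
    import numpy as np

    def sand_exclusion_layer(stack_db, low_db=-15.0, frac=0.9):
        """stack_db: (time, rows, cols). A pixel is flagged as sand-like if it is
        below `low_db` in at least `frac` of all acquisitions."""
        low_count = (stack_db < low_db).sum(axis=0)
        return low_count >= frac * stack_db.shape[0]

    def refine_flood_mask(flood_mask, sel):
        # Water detections over persistently dark (sand-like) surfaces are discarded.
        return flood_mask & ~sel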

  20. Deriving crop calendar using NDVI time-series

    Science.gov (United States)

    Patel, J. H.; Oza, M. P.

    2014-11-01

    Agricultural intensification is defined in terms of cropping intensity, which is the number of crops (single, double or triple) grown per year in a unit cropland area. Information about the crop calendar (i.e. the number of crops in a parcel of land, their planting and harvesting dates, and the date of the peak vegetative stage) is essential for proper management of agriculture. Remote sensing sensors provide a regular, consistent and reliable measurement of vegetation response at various growth stages of a crop and are therefore ideally suited for monitoring purposes. The spectral response of vegetation, as measured by the Normalized Difference Vegetation Index (NDVI) and its profiles, can provide a new dimension for describing the vegetation growth cycle. The analysis of NDVI values at regular time intervals provides useful information about various crop growth stages and the performance of a crop in a season. However, the NDVI data series has a considerable amount of local fluctuation in the time domain and needs to be smoothed so that the dominant seasonal behavior is enhanced. Based on temporal analysis of the smoothed NDVI series, it is possible to extract the number of crop cycles per year and their crop calendar. In the present study, a methodology is developed to extract key elements of the crop growth cycle (i.e. the number of crops per year and their planting - peak - harvesting dates). This is illustrated by analysing a MODIS-NDVI data series of one agricultural year (from June 2012 to May 2013) over Gujarat. Such an analysis is very useful for analysing the dynamics of kharif and rabi crops.
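
    One common way to implement the smoothing and crop-cycle extraction described above is sketched below with a Savitzky-Golay filter and simple peak detection; the window length, peak height and the use of surrounding NDVI minima as planting/harvest proxies are illustrative assumptions rather than the exact method of the study.

    # Minimal sketch of the smoothing and crop-cycle extraction step described above
    # (one of several possible implementations). `ndvi` is a regularly spaced
    # composite series for one pixel or parcel over an agricultural year.
    import numpy as np
    from scipy.signal import savgol_filter, find_peaks

    def crop_cycles(ndvi, window=7, poly=3, min_peak=0.4):
        smooth = savgol_filter(ndvi, window_length=window, polyorder=poly)
        # Each sufficiently high peak of the smoothed profile is taken as the
        # peak vegetative stage of one crop cycle.
        peaks, _ = find_peaks(smooth, height=min_peak, distance=4)
        cycles = []
        for p in peaks:
            # Simple planting/harvest proxies: lowest NDVI before and after the peak.
            left = int(np.argmin(smooth[:p])) if p > 0 else 0
            right = int(p + np.argmin(smooth[p:]))
            cycles.append({"planting_idx": left, "peak_idx": int(p), "harvest_idx": right})
        return smooth, cycles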

  1. Rheumatoid Arthritis Educational Video Series

    Medline Plus

    Full Text Available ... Patient Webcasts / Rheumatoid Arthritis Educational Video Series Rheumatoid Arthritis Educational Video Series This series of five videos ... member of our patient care team. Managing Your Arthritis Managing Your Arthritis Managing Chronic Pain and Depression ...

  2. Rheumatoid Arthritis Educational Video Series

    Science.gov (United States)

    This series of five videos covers Managing Your Arthritis, Managing Chronic Pain and Depression in Arthritis, Nutrition & Rheumatoid Arthritis, and Arthritis and Health-related Quality of Life.

  3. Multivariate stochastic analysis for Monthly hydrological time series at Cuyahoga River Basin

    Science.gov (United States)

    zhang, L.

    2011-12-01

    Copulas have become a very powerful statistical and stochastic methodology for multivariate analysis in environmental and water resources engineering. In recent years, the popular one-parameter Archimedean copulas, e.g. the Gumbel-Hougaard copula, Cook-Johnson copula and Frank copula, and the meta-elliptical copulas, e.g. the Gaussian copula and Student-t copula, have been applied in multivariate hydrological analyses, e.g. multivariate rainfall (rainfall intensity, duration and depth), flood (peak discharge, duration and volume), and drought analyses (drought length, mean and minimum SPI values, and drought mean areal extent). Copulas have also been applied in flood frequency analysis at the confluences of river systems by taking into account the dependence among upstream gauge stations rather than by using the hydrological routing technique. In most of the studies above, the annual time series have been considered as stationary signals in which the values are assumed to be independent and identically distributed (i.i.d.) random variables. But in reality, hydrological time series, especially daily and monthly hydrological time series, cannot be considered as i.i.d. random variables due to the periodicity present in the data structure. The stationarity assumption is also questionable due to climate change and land use and land cover (LULC) change in the past years. To this end, it is necessary to re-evaluate the classical approach for the study of hydrological time series by relaxing the stationarity assumption through a nonstationary approach. As to the study of the dependence structure of hydrological time series, the assumption of the same type of univariate distribution also needs to be relaxed by adopting copula theory. In this paper, the univariate monthly hydrological time series will be studied through the nonstationary time series analysis approach. The dependence structure of the multivariate monthly hydrological time series will be

  4. From Networks to Time Series

    Science.gov (United States)

    Shimada, Yutaka; Ikeguchi, Tohru; Shigehara, Takaomi

    2012-10-01

    In this Letter, we propose a framework to transform a complex network to a time series. The transformation from complex networks to time series is realized by the classical multidimensional scaling. Applying the transformation method to a model proposed by Watts and Strogatz [Nature (London) 393, 440 (1998)], we show that ring lattices are transformed to periodic time series, small-world networks to noisy periodic time series, and random networks to random time series. We also show that these relationships are analytically held by using the circulant-matrix theory and the perturbation theory of linear operators. The results are generalized to several high-dimensional lattices.
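
    The transformation can be sketched as follows: compute shortest-path distances between nodes, apply classical multidimensional scaling by double-centering the squared distance matrix, and read the leading coordinate node by node as a series. This is a generic illustration of classical MDS applied to a connected network; details of the published construction may differ.

    # Sketch of the network-to-time-series idea described above. Assumes a connected
    # network (finite geodesic distances).
    import numpy as np
    from scipy.sparse.csgraph import shortest_path

    def network_to_series(adjacency):
        D = shortest_path(adjacency, unweighted=True)      # geodesic distance matrix
        n = D.shape[0]
        J = np.eye(n) - np.ones((n, n)) / n                # centering matrix
        B = -0.5 * J @ (D ** 2) @ J                        # double-centered squared distances
        eigvals, eigvecs = np.linalg.eigh(B)
        order = np.argsort(eigvals)[::-1]
        coord = eigvecs[:, order[0]] * np.sqrt(max(eigvals[order[0]], 0.0))
        return coord                                       # one value per node, in node order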

  5. From divergent power series to analytic functions theory and application of multisummable power series

    CERN Document Server

    Balser, Werner

    1994-01-01

    Multisummability is a method which, for certain formal power series with radius of convergence equal to zero, produces an analytic function having the formal series as its asymptotic expansion. This book presents the theory of multisummability, and as an application, contains a proof of the fact that all formal power series solutions of non-linear meromorphic ODE are multisummable. It will be of use to graduate students and researchers in mathematics and theoretical physics, and especially to those who encounter formal power series to (physical) equations with rapidly, but regularly, growing coefficients.

  6. Computation of convex bounds for present value functions with random payments

    NARCIS (Netherlands)

    Ahcan, A.; Darkiewicz, G.; Goovaerts, M.J.; Hoedemakers, T.

    2006-01-01

    In this contribution we study the distribution of the present value function of a series of random payments in a stochastic financial environment. Such distributions occur naturally in a wide range of applications within fields of insurance and finance. We obtain accurate approximations by

  7. Prognostic value of nucleolar size and size pleomorphism in choroidal melanomas

    DEFF Research Database (Denmark)

    Sørensen, Flemming Brandt; Gamel, J W; Jensen, O A

    1993-01-01

    Morphometric estimates of nucleolar size have been shown to possess a high prognostic value in patients with uveal melanomas. The authors investigated various quantitative estimators of the mean size and pleomorphism of nucleoli in choroidal melanomas from a consecutive series of 95 Danish patien...

  8. Rheumatoid Arthritis Educational Video Series

    Medline Plus

    Full Text Available ... Corner / Patient Webcasts / Rheumatoid Arthritis Educational Video Series Rheumatoid Arthritis Educational Video Series This series of five videos ... Your Arthritis Managing Chronic Pain and Depression in Arthritis Nutrition & Rheumatoid Arthritis Arthritis and Health-related Quality of Life ...

  9. Duality between Time Series and Networks

    Science.gov (United States)

    Campanharo, Andriana S. L. O.; Sirer, M. Irmak; Malmgren, R. Dean; Ramos, Fernando M.; Amaral, Luís A. Nunes.

    2011-01-01

    Studying the interaction between a system's components and the temporal evolution of the system are two common ways to uncover and characterize its internal workings. Recently, several maps from a time series to a network have been proposed with the intent of using network metrics to characterize time series. Although these maps demonstrate that different time series result in networks with distinct topological properties, it remains unclear how these topological properties relate to the original time series. Here, we propose a map from a time series to a network with an approximate inverse operation, making it possible to use network statistics to characterize time series and time series statistics to characterize networks. As a proof of concept, we generate an ensemble of time series ranging from periodic to random and confirm that application of the proposed map retains much of the information encoded in the original time series (or networks) after application of the map (or its inverse). Our results suggest that network analysis can be used to distinguish different dynamic regimes in time series and, perhaps more importantly, time series analysis can provide a powerful set of tools that augment the traditional network analysis toolkit to quantify networks in new and useful ways. PMID:21858093
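
    A simple quantile-based version of such a map, with an approximate inverse, is sketched below: observed values binned into quantiles become nodes, transitions between consecutive observations become weighted edges, and a random walk on the resulting transition matrix generates a surrogate series. This is an illustrative reading of this family of maps, not necessarily the exact construction of the paper.

    # Quantile-based time-series-to-network map and its approximate inverse (sketch).
    import numpy as np

    def series_to_transition_matrix(x, q=50):
        edges = np.quantile(x, np.linspace(0, 1, q + 1))
        labels = np.clip(np.searchsorted(edges, x, side="right") - 1, 0, q - 1)
        W = np.zeros((q, q))
        for a, b in zip(labels[:-1], labels[1:]):
            W[a, b] += 1.0                                  # count transitions between bins
        row_sums = W.sum(axis=1, keepdims=True)
        P = np.divide(W, row_sums, out=np.zeros_like(W), where=row_sums > 0)
        return P, edges

    def transition_matrix_to_series(P, edges, n, seed=None):
        rng = np.random.default_rng(seed)
        centers = 0.5 * (edges[:-1] + edges[1:])
        state = rng.integers(P.shape[0])
        out = []
        for _ in range(n):
            out.append(centers[state])
            probs = P[state] if P[state].sum() > 0 else np.full(P.shape[0], 1.0 / P.shape[0])
            state = rng.choice(P.shape[0], p=probs)         # random walk on the network
        return np.array(out)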

  10. Physiological and Biochemical Characteristics in Flag Leaves of the C Liangyou Series of Hybrid Rice Combinations at Late Growth Stages

    Directory of Open Access Journals (Sweden)

    Wen-bang TANG

    2010-12-01

    Full Text Available The contents of chlorophyll, soluble sugars, soluble proteins and thiobarbituric acid reaction substance (TBARS), chlorophyll fluorescence parameters, net photosynthetic rate, as well as the activities of superoxide dismutase (SOD) and peroxidase (POD) of flag leaves at the late growth stages were studied by using the C Liangyou series of hybrid rice combinations as material and Shanyou 63 as control. The C Liangyou series of hybrid rice combinations used in the experiment included C Liangyou 396, C Liangyou 87, C Liangyou 755 and C Liangyou 34, which all used C815S as the male sterile line. The contents of chlorophyll, soluble sugars and soluble proteins in flag leaves of the C Liangyou series combinations at the late growth stages were higher than those of the control, whereas the TBARS content was lower than that of the control. The activities of SOD and POD were significantly higher than those of the control on the 7th day after heading, and then decreased slowly. The ΦPSII value and qP value of flag leaves decreased at the late growth stages, and these two parameters in flag leaves of the C Liangyou series combinations were higher than those of the control, while the qN value increased at the late growth stages and was lower than that of the control. The net photosynthetic rate of flag leaves at the late growth stage was higher compared with the control. These results suggest that slow senescence and strong photosynthetic capability in flag leaves at the late growth stages are the physiological basis of the C Liangyou series combinations.

  11. Series Transmission Line Transformer

    Science.gov (United States)

    Buckles, Robert A.; Booth, Rex; Yen, Boris T.

    2004-06-29

    A series transmission line transformer is set forth which includes two or more impedance-matched sets of at least two transmission lines, such as shielded cables, connected in parallel at one end and in series at the other in a cascading fashion. The cables are wound about a magnetic core. The series transmission line transformer (STLT) can provide higher impedance ratios and bandwidths, is scalable, and is of simpler design and construction.

  12. Series expansions without diagrams

    International Nuclear Information System (INIS)

    Bhanot, G.; Creutz, M.; Horvath, I.; Lacki, J.; Weckel, J.

    1994-01-01

    We discuss the use of recursive enumeration schemes to obtain low- and high-temperature series expansions for discrete statistical systems. Using linear combinations of generalized helical lattices, the method is competitive with diagrammatic approaches and is easily generalizable. We illustrate the approach using Ising and Potts models. We present low-temperature series results in up to five dimensions and high-temperature series in three dimensions. The method is general and can be applied to any discrete model

  13. Modelling of extreme rainfall events in Peninsular Malaysia based on annual maximum and partial duration series

    Science.gov (United States)

    Zin, Wan Zawiah Wan; Shinyie, Wendy Ling; Jemain, Abdul Aziz

    2015-02-01

    In this study, two series of data for extreme rainfall events are generated based on the Annual Maximum and Partial Duration Methods, derived from 102 rain-gauge stations in Peninsular Malaysia from 1982 to 2012. To determine the optimal threshold for each station, several requirements must be satisfied, and the Adapted Hill estimator is employed for this purpose. A semi-parametric bootstrap is then used to estimate the mean square error (MSE) of the estimator at each threshold, and the optimal threshold is selected based on the smallest MSE. The mean annual frequency is also checked to ensure that it lies in the range of one to five, and the resulting data are de-clustered to ensure independence. The two data series are then fitted to the Generalized Extreme Value and Generalized Pareto distributions for the annual maximum and partial duration series, respectively. The parameter estimation methods used are the Maximum Likelihood and the L-moment methods. Two goodness-of-fit tests are then used to evaluate the best-fitted distribution. The results showed that the Partial Duration series with the Generalized Pareto distribution and Maximum Likelihood parameter estimation provides the best representation of extreme rainfall events in Peninsular Malaysia for the majority of the stations studied. Based on these findings, several return values are also derived and spatial maps are constructed to identify the distribution characteristics of extreme rainfall in Peninsular Malaysia.
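
    For the partial duration route, the core fitting step can be sketched as below: exceedances over a chosen threshold are fitted to a Generalized Pareto distribution by maximum likelihood and converted to a return level. The threshold-selection bootstrap, de-clustering and L-moment estimation used in the study are not reproduced; the function and its defaults are illustrative.

    # Minimal sketch of fitting a Generalized Pareto distribution to a partial
    # duration series and computing a return level (maximum likelihood only).
    import numpy as np
    from scipy.stats import genpareto

    def gpd_return_level(daily_rain, threshold, return_period_years, days_per_year=365.25):
        exceed = daily_rain[daily_rain > threshold] - threshold
        shape, _, scale = genpareto.fit(exceed, floc=0.0)          # location fixed at 0
        rate = len(exceed) / (len(daily_rain) / days_per_year)     # mean exceedances per year
        # Return level: the quantile exceeded on average once per return period.
        p = 1.0 - 1.0 / (return_period_years * rate)
        return threshold + genpareto.ppf(p, shape, loc=0.0, scale=scale)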

  14. Statistical analysis of yearly series of maximum daily rainfall in Spain. Análisis estadístico de las series anuales de máximas lluvias diarias en España

    Energy Technology Data Exchange (ETDEWEB)

    Ferrer Polo, J.; Ardiles Lopez, K. L. (CEDEX, Ministerio de Obras Publicas, Transportes y Medio ambiente, Madrid (Spain))

    1994-01-01

    Work on the statistical modelling of maximum daily rainfalls is presented, with a view to estimating the quantiles for different return periods. An index flood approach has been adopted in which the local quantiles are the result of rescaling a regional law using the mean of each series of values, which is utilized as a local scale factor. The annual maximum series have been taken from 1,545 meteorological stations over a 30-year period, and these have been classified into 26 regions defined according to meteorological criteria, the homogeneity of which has been checked by means of a statistical analysis of the coefficients of variation of the samples. An estimation has been made of the parameters for the following four distribution models: Two Component Extreme Value (TCEV); General Extreme Value (GEV); Log-Pearson III (LP3); and the SQRT-Exponential Type Distribution of Maximum. The analysis of the quantiles obtained reveals only slight differences in the results, thus detracting from the importance of the model selection. The last of the above-mentioned distributions has finally been chosen, on the basis of the following: it is defined with fewer parameters; it is the only one that was proposed specifically for the analysis of daily rainfall maxima; it yields more conservative results than the traditional Gumbel distribution for high return periods; and it is capable of providing a good description of the main sampling statistics concerning the right-hand tail of the distribution, a fact that has been checked with Monte Carlo simulation techniques. The choice of a distribution model with only two parameters has led to the selection of the regional coefficient of variation as the only determining parameter for the regional quantiles. This has permitted the elimination of the quantile discontinuity of the classical regional approach, by smoothing the values of that coefficient by means of an isoline map on a national scale.

  15. On the Use of Running Trends as Summary Statistics for Univariate Time Series and Time Series Association

    OpenAIRE

    Trottini, Mario; Vigo, Isabel; Belda, Santiago

    2015-01-01

    Given a time series, running trends analysis (RTA) involves evaluating least squares trends over overlapping time windows of L consecutive time points, with overlap by all but one observation. This produces a new series called the “running trends series,” which is used as summary statistics of the original series for further analysis. In recent years, RTA has been widely used in applied climate research as summary statistics for time series and time series association. There is no doubt that ...

  16. A Course in Time Series Analysis

    CERN Document Server

    Peña, Daniel; Tsay, Ruey S

    2011-01-01

    New statistical methods and future directions of research in time series A Course in Time Series Analysis demonstrates how to build time series models for univariate and multivariate time series data. It brings together material previously available only in the professional literature and presents a unified view of the most advanced procedures available for time series model building. The authors begin with basic concepts in univariate time series, providing an up-to-date presentation of ARIMA models, including the Kalman filter, outlier analysis, automatic methods for building ARIMA models, a

  17. Topological data analysis of financial time series: Landscapes of crashes

    Science.gov (United States)

    Gidea, Marian; Katz, Yuri

    2018-02-01

    We explore the evolution of daily returns of four major US stock market indices during the technology crash of 2000, and the financial crisis of 2007-2009. Our methodology is based on topological data analysis (TDA). We use persistent homology to detect and quantify topological patterns that appear in multidimensional time series. Using a sliding window, we extract time-dependent point cloud data sets, to which we associate a topological space. We detect transient loops that appear in this space, and we measure their persistence. This is encoded in real-valued functions referred to as 'persistence landscapes'. We quantify the temporal changes in persistence landscapes via their Lp-norms. We test this procedure on multidimensional time series generated by various non-linear and non-equilibrium models. We find that, in the vicinity of financial meltdowns, the Lp-norms exhibit strong growth prior to the primary peak, which ascends during a crash. Remarkably, the average spectral density at low frequencies of the time series of Lp-norms of the persistence landscapes demonstrates a strong rising trend for 250 trading days prior to either the dotcom crash on 03/10/2000 or the Lehman bankruptcy on 09/15/2008. Our study suggests that TDA provides a new type of econometric analysis, which complements the standard statistical measures. The method can be used to detect early warning signals of imminent market crashes. We believe that this approach can be used beyond the analysis of financial time series presented here.
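
    A compressed sketch of this pipeline is shown below, assuming the gudhi package is available for persistent homology: sliding-window point clouds are built from the multidimensional return series, dimension-1 persistence intervals are computed from a Rips complex, and the Lp-norm of the first persistence landscape is evaluated on a grid. The window length, maximal edge length and grid are illustrative choices, not those of the paper.

    # Sketch of the TDA pipeline described above, assuming the `gudhi` package.
    import numpy as np
    import gudhi

    def landscape_norm(returns, window=50, p=1, grid_size=200):
        """returns: (time, n_indices) daily returns; yields one L^p norm per window."""
        norms = []
        for start in range(returns.shape[0] - window + 1):
            cloud = returns[start:start + window]           # one point per day
            rips = gudhi.RipsComplex(points=cloud, max_edge_length=2.0)
            st = rips.create_simplex_tree(max_dimension=2)
            st.persistence()
            dgm = st.persistence_intervals_in_dimension(1)  # (birth, death) pairs of loops
            if len(dgm) == 0:
                norms.append(0.0)
                continue
            dgm = np.asarray(dgm, dtype=float)
            dgm[~np.isfinite(dgm)] = 2.0                    # cap unbounded deaths at max edge length
            t = np.linspace(dgm.min(), dgm.max(), grid_size)
            # First persistence landscape: pointwise max of the "tent" functions.
            tents = np.maximum(np.minimum(t[None, :] - dgm[:, 0:1], dgm[:, 1:2] - t[None, :]), 0.0)
            lam1 = tents.max(axis=0)
            norms.append(np.trapz(lam1 ** p, t) ** (1.0 / p))
        return np.array(norms)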

  18. Using lagged dependence to identify (de)coupled surface and subsurface soil moisture values

    Science.gov (United States)

    Carranza, Coleen D. U.; van der Ploeg, Martine J.; Torfs, Paul J. J. F.

    2018-04-01

    Recent advances in radar remote sensing popularized the mapping of surface soil moisture at different spatial scales. Surface soil moisture measurements are used in combination with hydrological models to determine subsurface soil moisture values. However, variability of soil moisture across the soil column is important for estimating depth-integrated values, as decoupling between surface and subsurface can occur. In this study, we employ new methods to investigate the occurrence of (de)coupling between surface and subsurface soil moisture. Using time series datasets, lagged dependence was incorporated in assessing (de)coupling with the idea that surface soil moisture conditions will be reflected at the subsurface after a certain delay. The main approach involves the application of a distributed-lag nonlinear model (DLNM) to simultaneously represent both the functional relation and the lag structure in the time series. The results of an exploratory analysis using residuals from a fitted loess function serve as a posteriori information to determine (de)coupled values. Both methods allow for a range of (de)coupled soil moisture values to be quantified. Results provide new insights into the decoupled range as its occurrence among the sites investigated is not limited to dry conditions.

  19. Study of the dependence of temporal resolution on activity for a Philips Gemini TF PET/CT scanner by applying a statistical analysis of time series

    International Nuclear Information System (INIS)

    Sanchez Merino, G.; Cortes Rodicio, J.; Lope Lope, R.; Martin Gonzalez, T.; Garcia Fidalgo, M. A.

    2013-01-01

    The aim of the present work is to study the dependence of temporal resolution on activity using statistical techniques applied to the time series of temporal resolution values measured during daily equipment checks. (Author)

  20. An exact power series formula of the outage probability with noise and interference over generalized fading channels

    KAUST Repository

    Rached, Nadhir B.

    2016-12-24

    In this paper, we develop a generalized moment-based approach for the evaluation of the outage probability (OP) in the presence of co-channel interference and additive white Gaussian noise. The proposed method allows the evaluation of the OP of the signal-to-interference-plus-noise ratio by a power series expansion in the threshold value. Its main advantage is that it does not require a particular distribution for the interference channels. The only necessary ingredients are a power series expansion for the cumulative distribution function of the desired user power and the cross-moments of the interferers' powers. These requirements are easily met in many practical fading models, for which the OP might not be obtained in closed-form expression. For the sake of illustration, we consider the application of our method to the Rician fading environment. Under this setting, we carry out a convergence study of the proposed power series and corroborate the validity of our method for different values of fading parameters and various numbers of co-channel interferers.

  1. Fractal dynamics of heartbeat time series of young persons with metabolic syndrome

    Science.gov (United States)

    Muñoz-Diosdado, A.; Alonso-Martínez, A.; Ramírez-Hernández, L.; Martínez-Hernández, G.

    2012-10-01

    In recent years, many physiological systems have been quantitatively characterized using fractal analysis. We applied it to study the heart rate variability of young subjects with metabolic syndrome (MS); we examined the RR time series (time between two R waves in the ECG) with the detrended fluctuation analysis (DFA) method, Higuchi's fractal dimension method and multifractal analysis to detect the possible presence of heart problems. The results show that although the young persons have MS, the majority do not present alterations in the heart dynamics. However, there were cases where the fractal parameter values differed significantly from the healthy people values.
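
    For reference, a standard detrended fluctuation analysis of an RR-interval series can be sketched as follows; this is the textbook DFA recipe (integrate, detrend in windows, fit the log-log fluctuation curve), not the authors' implementation.

    # Standard DFA sketch for an RR-interval series.
    import numpy as np

    def dfa_alpha(rr, scales=None):
        x = np.asarray(rr, dtype=float)
        y = np.cumsum(x - x.mean())                 # integrated (profile) series
        if scales is None:
            scales = np.unique(np.logspace(np.log10(4), np.log10(len(x) // 4), 20).astype(int))
        flucts = []
        for s in scales:
            n_seg = len(y) // s
            segs = y[:n_seg * s].reshape(n_seg, s)
            t = np.arange(s)
            f2 = []
            for seg in segs:
                coef = np.polyfit(t, seg, 1)        # local linear detrending
                f2.append(np.mean((seg - np.polyval(coef, t)) ** 2))
            flucts.append(np.sqrt(np.mean(f2)))
        alpha = np.polyfit(np.log(scales), np.log(flucts), 1)[0]   # scaling exponent
        return alpha, scales, np.array(flucts)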

  2. Local and global recoding methods for anonymizing set-valued data

    KAUST Repository

    Terrovitis, Manolis

    2010-06-10

    In this paper, we study the problem of protecting privacy in the publication of set-valued data. Consider a collection of supermarket transactions that contains detailed information about items bought together by individuals. Even after removing all personal characteristics of the buyer, which can serve as links to his identity, the publication of such data is still subject to privacy attacks from adversaries who have partial knowledge about the set. Unlike most previous works, we do not distinguish data as sensitive and non-sensitive, but we consider them both as potential quasi-identifiers and potential sensitive data, depending on the knowledge of the adversary. We define a new version of the k-anonymity guarantee, the k^m-anonymity, to limit the effects of the data dimensionality, and we propose efficient algorithms to transform the database. Our anonymization model relies on generalization instead of suppression, which is the most common practice in related works on such data. We develop an algorithm that finds the optimal solution, however, at a high cost that makes it inapplicable for large, realistic problems. Then, we propose a greedy heuristic, which performs generalizations in an Apriori, level-wise fashion. The heuristic scales much better and in most of the cases finds a solution close to the optimal. Finally, we investigate the application of techniques that partition the database and perform anonymization locally, aiming at the reduction of the memory consumption and further scalability. A thorough experimental evaluation with real datasets shows that a vertical partitioning approach achieves excellent results in practice. © 2010 Springer-Verlag.
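
    The guarantee itself is easy to state in code: every combination of at most m items that occurs in the data must be shared by at least k records. The brute-force check below illustrates this for small tables; the Apriori-style generalization algorithms proposed in the paper are not reproduced here, and the example data are fictitious.

    # Brute-force k^m-anonymity check for small set-valued tables (illustration only).
    from itertools import combinations
    from collections import Counter

    def is_km_anonymous(transactions, k, m):
        """transactions: list of sets of items."""
        support = Counter()
        for t in transactions:
            for size in range(1, m + 1):
                for combo in combinations(sorted(t), size):
                    support[combo] += 1
        # Every observed item combination of size <= m must match at least k records.
        return all(count >= k for count in support.values())

    # Example: with k=2, m=2 this table fails because {a, c} identifies a single record.
    table = [{"a", "b"}, {"a", "b"}, {"a", "c"}]
    print(is_km_anonymous(table, k=2, m=2))   # False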

  3. Review of "Biomedical Informatics; Computer Applications in Health Care and Biomedicine" by Edward H. Shortliffe and James J. Cimino

    OpenAIRE

    Clifford Gari D

    2006-01-01

    Abstract This article is an invited review of the third edition of "Biomedical Informatics; Computer Applications in Health Care and Biomedicine", one of thirty-six volumes in Springer's 'Health Informatics Series', edited by E. Shortliffe and J. Cimino. This book spans most of the current methods and issues in health informatics, ranging through subjects as varied as data acquisition and storage, standards, natural language processing, imaging, electronic health records, decision support, te...

  4. Gases and carbon in metals (thermodynamics, kinetics, and properties). Pt. 10

    International Nuclear Information System (INIS)

    Jehn, H.; Speck, H.; Fromm, E.; Hoerz, G.

    1980-01-01

    This issue is part of a series of data on Gases and Carbon in Metals which supplements the data compilation in the book Gase und Kohlenstoff in Metallen (Gases and Carbon in Metals), edited by E. Fromm and E. Gebhardt, Springer-Verlag, Berlin 1976. The present survey covers chromium and tungsten, includes results from papers published after the copy deadline and recommends critically selected data. Furthermore it comprises a bibliography of relevant literature. (GE) [de

  5. Monitoring cotton root rot by synthetic Sentinel-2 NDVI time series using improved spatial and temporal data fusion.

    Science.gov (United States)

    Wu, Mingquan; Yang, Chenghai; Song, Xiaoyu; Hoffmann, Wesley Clint; Huang, Wenjiang; Niu, Zheng; Wang, Changyao; Li, Wang; Yu, Bo

    2018-01-31

    To better understand the progression of cotton root rot within the season, time series monitoring is required. In this study, an improved spatial and temporal data fusion approach (ISTDFA) was employed to combine 250-m Moderate Resolution Imaging Spectroradiometer (MODIS) Normalized Difference Vegetation Index (NDVI) and 10-m Sentinel-2 NDVI data to generate a synthetic Sentinel-2 NDVI time series for monitoring this disease. Then, the phenology of healthy cotton and infected cotton was modeled using a logistic model. Finally, several phenology parameters, including the onset day of greenness minimum (OGM), growing season length (GSL), onset of greenness increase (OGI), max NDVI value, and integral area of the phenology curve, were calculated. The results showed that ISTDFA could be used to combine time series MODIS and Sentinel-2 NDVI data with a correlation coefficient of 0.893. The logistic model could describe the phenology curves with R-squared values from 0.791 to 0.969. Moreover, the phenology curve of infected cotton showed a significant difference from that of healthy cotton. The max NDVI value, OGM, GSL and the integral area of the phenology curve for infected cotton were reduced by 0.045, 30 days, 22 days, and 18.54%, respectively, compared with those for healthy cotton.

  6. Non-invasive breast biopsy method using GD-DTPA contrast enhanced MRI series and F-18-FDG PET/CT dynamic image series

    Science.gov (United States)

    Magri, Alphonso William

    algorithm. The best-fit parameters were used to create 3D parametric images. Compartmental modeling evaluation was based on the ability of parameter values to differentiate between tissue types. This evaluation was used on registered and unregistered image series and found that registration improved results. (5) PET and MR parametric images were registered through FEM- and FFD-based registration. Parametric image registration was evaluated using similarity measurements, target registration error, and qualitative comparison. Comparing FFD and FEM-based registration results showed that the FEM method is superior. This five-step process constitutes a novel multifaceted approach to a nonsurgical breast biopsy that successfully executes each step. Comparison of this method to biopsy still needs to be done with a larger set of subject data.

  7. Financial Time Series Prediction Using Elman Recurrent Random Neural Networks

    Directory of Open Access Journals (Sweden)

    Jie Wang

    2016-01-01

    (ERNN), the empirical results show that the proposed neural network displays the best performance among these neural networks in financial time series forecasting. Further, the empirical research is performed in testing the predictive effects of SSE, TWSE, KOSPI, and Nikkei225 with the established model, and the corresponding statistical comparisons of the above market indices are also exhibited. The experimental results show that this approach gives good performance in predicting the values from the stock market indices.

  8. Extreme events in total ozone over Arosa – Part 1: Application of extreme value theory

    Directory of Open Access Journals (Sweden)

    H. E. Rieder

    2010-10-01

    Full Text Available In this study ideas from extreme value theory are for the first time applied in the field of stratospheric ozone research, because statistical analysis showed that previously used concepts assuming a Gaussian distribution (e.g. fixed deviations from mean values) of total ozone data do not adequately address the structure of the extremes. We show that statistical extreme value methods are appropriate to identify ozone extremes and to describe the tails of the Arosa (Switzerland) total ozone time series. In order to accommodate the seasonal cycle in total ozone, a daily moving threshold was determined and used, with tools from extreme value theory, to analyse the frequency of days with extreme low (termed ELOs) and high (termed EHOs) total ozone at Arosa. The analysis shows that the Generalized Pareto Distribution (GPD) provides an appropriate model for the frequency distribution of total ozone above or below a mathematically well-defined threshold, thus providing a statistical description of ELOs and EHOs. The results show an increase in ELOs and a decrease in EHOs during the last decades. The fitted model represents the tails of the total ozone data set with high accuracy over the entire range (including absolute monthly minima and maxima), and enables a precise computation of the frequency distribution of ozone mini-holes (using constant thresholds). Analyzing the tails instead of a small fraction of days below constant thresholds provides deeper insight into the time series properties. Fingerprints of dynamical (e.g. ENSO, NAO) and chemical features (e.g. strong polar vortex ozone loss, and major volcanic eruptions) can be identified in the observed frequency of extreme events throughout the time series. Overall the new approach to analysis of extremes provides more information on time series properties and variability than previous approaches that use only monthly averages and/or mini-holes and mini-highs.

  9. Estimation of Airborne Lidar-Derived Tropical Forest Canopy Height Using Landsat Time Series in Cambodia

    Directory of Open Access Journals (Sweden)

    Tetsuji Ota

    2014-11-01

    Full Text Available In this study, we test and demonstrate the utility of disturbance and recovery information derived from annual Landsat time series to predict current forest vertical structure (as compared to the more common approaches that consider a sample of airborne Lidar and single-date Landsat derived variables). Mean Canopy Height (MCH) was estimated separately using single date, time series, and the combination of single date and time series variables in multiple regression and random forest (RF) models. The combination of single date and time series variables, which integrate disturbance history over the entire time series, overall provided better MCH prediction than using either of the two sets of variables separately. In general, the RF models resulted in improved performance in all estimates over those using multiple regression. The lowest validation error was obtained using Landsat time series variables in a RF model (R2 = 0.75 and RMSE = 2.81 m). Combining single date and time series data was more effective when the RF model was used (as opposed to multiple regression). The RMSE for RF mean canopy height prediction was reduced by 13.5% when combining the two sets of variables as compared to the 3.6% RMSE decline presented by multiple regression. This study demonstrates the value of airborne Lidar and long term Landsat observations to generate estimates of forest canopy height using the random forest algorithm.
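
    The modelling step can be sketched generically as below: a random forest regressor trained on Lidar-derived MCH samples with whatever combination of single-date and time-series Landsat predictors is supplied. The hyperparameters, the 70/30 split and the helper name fit_mch_model are illustrative assumptions, not the study's settings.

    # Generic sketch of the random forest canopy height regression described above.
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import mean_squared_error, r2_score

    def fit_mch_model(X, y, random_state=0):
        """X: (samples, features) Landsat predictors, y: Lidar-derived MCH (m)."""
        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=random_state)
        rf = RandomForestRegressor(n_estimators=500, random_state=random_state)
        rf.fit(X_tr, y_tr)
        pred = rf.predict(X_te)
        rmse = float(np.sqrt(mean_squared_error(y_te, pred)))
        return rf, r2_score(y_te, pred), rmse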

  10. Normalization of time-series satellite reflectance data to a standard sun-target-sensor geometry using a semi-empirical model

    Science.gov (United States)

    Zhao, Yongguang; Li, Chuanrong; Ma, Lingling; Tang, Lingli; Wang, Ning; Zhou, Chuncheng; Qian, Yonggang

    2017-10-01

    Time series of satellite reflectance data have been widely used to characterize environmental phenomena, describe trends in vegetation dynamics and study climate change. However, several sensors with wide spatial coverage and high observation frequency are usually designed to have a large field of view (FOV), which causes variations in the sun-target-sensor geometry in time-series reflectance data. In this study, on the basis of the semi-empirical kernel-driven BRDF model, a new semi-empirical model was proposed to normalize the sun-target-sensor geometry of remote sensing images. To evaluate the proposed model, bidirectional reflectances under different canopy growth conditions simulated by the Discrete Anisotropic Radiative Transfer (DART) model were used. The semi-empirical model was first fitted by using all simulated bidirectional reflectances. Experimental results showed a good fit between the bidirectional reflectance estimated by the proposed model and the simulated value. Then, MODIS time-series reflectance data were normalized to a common sun-target-sensor geometry by the proposed model. The experimental results showed the proposed model yielded good fits between the observed and estimated values. The noise-like fluctuations in the time-series reflectance data were also reduced after the sun-target-sensor normalization process.

  11. Thermodynamic functions of ion solvation in normal alcohols of aliphatic series

    International Nuclear Information System (INIS)

    Sergeeva, I.A.

    1978-01-01

    Thermodynamic functions of ion solvation of alkali and alkaline earth metals and halogenides in 9 alcohols are calculated using the earlier suggested method. It is shown that the summary values are in good accord with experimental ones (the deviations do not exceed 0-5%), that the solvation energies of one and the same electrolyte do not change along the series of n-alcohols, and that the enthalpy and entropy of solvation increase from the lower alcohols to the higher ones.

  12. Hyperreal Numbers for Infinite Divergent Series

    OpenAIRE

    Bartlett, Jonathan

    2018-01-01

    Treating divergent series properly has been an ongoing issue in mathematics. However, many of the problems in divergent series stem from the fact that divergent series were discovered prior to having a number system which could handle them. The infinities that resulted from divergent series led to contradictions within the real number system, but these contradictions are largely alleviated with the hyperreal number system. Hyperreal numbers provide a framework for dealing with divergent serie...

  13. Diagnostic significance of rib series in minor thorax trauma compared to plain chest film and computed tomography.

    Science.gov (United States)

    Hoffstetter, Patrick; Dornia, Christian; Schäfer, Stephan; Wagner, Merle; Dendl, Lena M; Stroszczynski, Christian; Schreyer, Andreas G

    2014-01-01

    Rib series (RS) are a special radiological technique to improve the visualization of the bony parts of the chest. The aim of this study was to evaluate the diagnostic accuracy of rib series in minor thorax trauma. Retrospective study of 56 patients who received RS; 39 patients were additionally evaluated by plain chest film (PCF). All patients underwent a computed tomography (CT) of the chest. RS and PCF were re-read independently by three radiologists, and the results were compared with CT as the gold standard. Sensitivity, specificity, and negative and positive predictive values were calculated. The significance of the differences in findings was determined by the McNemar test, and interobserver variability by Cohen's kappa test. 56 patients were evaluated (34 men, 22 women, mean age 61 years). In 22 patients one or more rib fractures could be identified by CT. In 18 of these cases (82%) the correct diagnosis was made by RS, and in 16 cases (73%) the correct number of involved ribs was detected. These differences were significant (p = 0.03). Specificity was 100%, and the negative and positive predictive values were 85% and 100%. Kappa values for the interobserver agreement were 0.92-0.96. The sensitivity of PCF was 46% and was significantly lower (p = 0.008) compared to CT. Rib series does not seem to be a useful examination in evaluating minor thorax trauma. CT seems to be the method of choice to detect rib fractures, but the clinical value of the radiological proof has to be discussed and investigated in larger follow-up studies.

  14. Evaluating Annual Maximum and Partial Duration Series for Estimating Frequency of Small Magnitude Floods

    Directory of Open Access Journals (Sweden)

    Fazlul Karim

    2017-06-01

    Full Text Available Understanding the nature of frequent floods is important for characterising channel morphology, riparian and aquatic habitat, and informing river restoration efforts. This paper presents results from an analysis of frequency estimates of low magnitude floods using annual maximum and partial series data compared to the actual flood series. Five frequency distribution models were fitted to data from 24 gauging stations in the Great Barrier Reef (GBR) lagoon catchments in north-eastern Australia. Based on the goodness of fit test, Generalised Extreme Value, Generalised Pareto and Log Pearson Type 3 models were used to estimate flood frequencies across the study region. Results suggest frequency estimates based on a partial series are better, compared to an annual series, for small to medium floods, while both methods produce similar results for large floods. Although both methods converge at a higher recurrence interval, the convergence recurrence interval varies between catchments. Results also suggest frequency estimates vary slightly between two or more partial series, depending on the flood threshold, and the differences are large for the catchments that experience less frequent floods. While a partial series produces better frequency estimates, it can underestimate or overestimate the frequency if the flood threshold differs largely from bankfull discharge. These results have significant implications for calculating the dependency of floodplain ecosystems on the frequency of flooding and their subsequent management.

  15. Importance of human values of personnel in the contemporary organization

    Directory of Open Access Journals (Sweden)

    Şerb Diana

    2016-06-01

    Full Text Available Addressing the importance of the human values of personnel in contemporary organizations, this article has two parts: a theoretical part and a practical part. The first part presents knowledge about the concept of human values. In part two of the article we carried out desk research based on the analysis of secondary sources. The analysis is approached from two perspectives: at the European level and at the national level. The assumption behind this article is that human values are essential in the workplace. Data were retrieved and processed in Excel and SPSS. In order to test the research hypotheses, correlation was used. To support the argument we used a series of tables and representative images. The conclusions of this analysis show that the Romanian and European respondents consider the following human values important: creativity and freedom of decision.

  16. Methods for summing general Kapteyn series

    Energy Technology Data Exchange (ETDEWEB)

    Tautz, R C [Zentrum fuer Astronomie und Astrophysik, Technische Universitaet Berlin, Hardenbergstrasse 36, D-10623 Berlin (Germany); Lerche, I [Institut fuer Geowissenschaften, Naturwissenschaftliche Fakultaet III, Martin-Luther-Universitaet Halle, D-06099 Halle (Germany); Dominici, D, E-mail: rct@gmx.eu, E-mail: lercheian@yahoo.com, E-mail: dominicd@newpaltz.edu [Department of Mathematics, State University of New York at New Paltz, 1 Hawk Dr, New Paltz, NY 12561-2443 (United States)

    2011-09-23

    The general features and characteristics of Kapteyn series, which are a special type of series involving the Bessel function, are investigated. For many applications in physics, astrophysics and mathematics, it is crucial to have closed-form expressions in order to determine their functional structure and parametric behavior. The closed-form expressions of Kapteyn series have mostly been limited to special cases, even though there are often similarities in the approaches used to reduce the series to analytically tractable forms. The goal of this paper is to review the previous work in the area and to show that Kapteyn series can be expressed as trigonometric or gamma function series, which can be evaluated in a closed form for specific parameters. Two examples with a similar structure are given, showing the complexity of Kapteyn series. (paper)
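
    As standard background (not a result of the paper), a Kapteyn series has the general form below, and the classical example comes from the series solution of Kepler's equation:

    % General form of a Kapteyn series and the classical example arising from the
    % Kepler equation (standard background, not results of the paper above).
    \[
      K(z) \;=\; \sum_{n=1}^{\infty} a_n \, J_{\nu + n}\!\bigl((\nu + n)\, z\bigr),
    \]
    % e.g. the solution E of Kepler's equation E - e\sin E = M can be written as
    \[
      E \;=\; M + \sum_{n=1}^{\infty} \frac{2}{n}\, J_n(n e)\, \sin(nM), \qquad 0 \le e < 1 .
    \]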

  17. Multiple Indicator Stationary Time Series Models.

    Science.gov (United States)

    Sivo, Stephen A.

    2001-01-01

    Discusses the propriety and practical advantages of specifying multivariate time series models in the context of structural equation modeling for time series and longitudinal panel data. For time series data, the multiple indicator model specification improves on classical time series analysis. For panel data, the multiple indicator model…

  18. Whole body X-ray CT scanner SCT-3000T series

    International Nuclear Information System (INIS)

    Saida, Teruhiko; Takemura, Kunihiko; Suzuki, Satoru; Sato, Yukio; Kawamoto, Yasushi; Goto, Mitsuhiro; Mishina, Yukio

    1989-01-01

    The whole body CT scanner SCT-3000T series, which improves patient throughput and diagnostic capability, has been developed. In the SCT-3000T series CT scanners, a great reduction of the reconstruction time and the scan cycle time has been achieved by developing special-purpose hardware for image reconstruction, such as a fast front-end processor and an intelligent buffer memory. Under routine operating conditions of the SCT-3000TX, including a 3.0 sec scan, table increment, image reconstruction and image filing, the scan cycle time is about 9 seconds, which is the shortest value among the competitive models. Furthermore, higher diagnostic capability has been provided with the system by adopting 1024 x 1024 display matrices and by developing diagnostic software such as a 3-D display program, an arbitrary curved plane MPR program and an r-CBF measurement program. (author)

  19. Nonparametric autocovariance estimation from censored time series by Gaussian imputation.

    Science.gov (United States)

    Park, Jung Wook; Genton, Marc G; Ghosh, Sujit K

    2009-02-01

    One of the most frequently used methods to model the autocovariance function of a second-order stationary time series is to use the parametric framework of autoregressive and moving average models developed by Box and Jenkins. However, such parametric models, though very flexible, may not always be adequate to model autocovariance functions with sharp changes. Furthermore, if the data do not follow the parametric model and are censored at a certain value, the estimation results may not be reliable. We develop a Gaussian imputation method to estimate an autocovariance structure via nonparametric estimation of the autocovariance function in order to address both censoring and incorrect model specification. We demonstrate the effectiveness of the technique in terms of bias and efficiency with simulations under various rates of censoring and underlying models. We describe its application to a time series of silicon concentrations in the Arctic.

  20. Modeling Philippine Stock Exchange Composite Index Using Time Series Analysis

    Science.gov (United States)

    Gayo, W. S.; Urrutia, J. D.; Temple, J. M. F.; Sandoval, J. R. D.; Sanglay, J. E. A.

    2015-06-01

    This study was conducted to develop a time series model of the Philippine Stock Exchange Composite Index and its volatility using the finite mixture of ARIMA models with conditional variance equations such as the ARCH, GARCH, EGARCH, TARCH and PARCH models. Also, the study aimed to find out the reason behind the behavior of PSEi, that is, which of the economic variables - Consumer Price Index, crude oil price, foreign exchange rate, gold price, interest rate, money supply, price-earnings ratio, Producers’ Price Index and terms of trade - can be used in projecting future values of PSEi, and this was examined using the Granger Causality Test. The findings showed that the best time series model for the Philippine Stock Exchange Composite Index is ARIMA(1,1,5) - ARCH(1). Also, Consumer Price Index, crude oil price and foreign exchange rate are factors concluded to Granger cause the Philippine Stock Exchange Composite Index.

  1. Forecasting air quality time series using deep learning.

    Science.gov (United States)

    Freeman, Brian S; Taylor, Graham; Gharabaghi, Bahram; Thé, Jesse

    2018-04-13

    This paper presents one of the first applications of deep learning (DL) techniques to predict air pollution time series. Air quality management relies extensively on time series data captured at air monitoring stations as the basis of identifying population exposure to airborne pollutants and determining compliance with local ambient air standards. In this paper, 8 hr averaged surface ozone (O3) concentrations were predicted using deep learning consisting of a recurrent neural network (RNN) with long short-term memory (LSTM). Hourly air quality and meteorological data were used to train and forecast values up to 72 hours with low error rates. The LSTM was able to forecast the duration of continuous O3 exceedances as well. Prior to training the network, the dataset was reviewed for missing data and outliers. Missing data were imputed using a novel technique that averaged gaps less than eight time steps with incremental steps based on first-order differences of neighboring time periods. Data were then used to train decision trees to evaluate input feature importance over different time prediction horizons. The number of features used to train the LSTM model was reduced from 25 features to 5 features, resulting in improved accuracy as measured by Mean Absolute Error (MAE). Parameter sensitivity analysis showed that look-back nodes associated with the RNN can be a significant source of error if not aligned with the prediction horizon. Overall, MAEs less than 2 were calculated for predictions out to 72 hours. Novel deep learning techniques were used to train an 8-hour averaged ozone forecast model. Missing data and outliers within the captured data set were replaced using a new imputation method that generated calculated values closer to the expected value based on the time and season. Decision trees were used to identify input variables with the greatest importance. The methods presented in this paper allow air managers to forecast long range air pollution
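
    The gap-filling rule is described only briefly, so the sketch below is one plausible reading of it, not the authors' code: gaps shorter than eight time steps are filled by stepping incrementally between the neighboring valid values.

    # One plausible reading of the short-gap imputation described above (an
    # interpretation for illustration, not the authors' implementation).
    import numpy as np

    def fill_short_gaps(x, max_gap=8):
        x = np.asarray(x, dtype=float).copy()
        isnan = np.isnan(x)
        i = 0
        while i < len(x):
            if isnan[i]:
                j = i
                while j < len(x) and isnan[j]:
                    j += 1
                gap = j - i
                if gap < max_gap and i > 0 and j < len(x):
                    # Incremental steps from the last valid value to the next valid value.
                    step = (x[j] - x[i - 1]) / (gap + 1)
                    x[i:j] = x[i - 1] + step * np.arange(1, gap + 1)
                i = j
            else:
                i += 1
        return x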

  2. Summability of alterations of convergent series

    Directory of Open Access Journals (Sweden)

    T. A. Keagy

    1981-01-01

    Full Text Available The effect of splitting, rearrangement, and grouping series alterations on the summability of a convergent series by ℓ−ℓ and cs−cs matrix methods is studied. Conditions are determined that guarantee the existence of alterations that are transformed into divergent series and into series with preassigned sums.

  3. Uranium series disequilibrium studies at the Broubster analogue site

    International Nuclear Information System (INIS)

    Longworth, G.; Ivanovich, M.; Wilkins, M.A.

    1989-09-01

    Uranium series measurements at a natural analogue site at Broubster, Caithness have been used to investigate radionuclide migration over a period of several hundred to 10⁶ years. The measured values for the uranium concentration and the activity ratios ²³⁴U/²³⁸U and ²³⁰Th/²³⁴U indicate that the geochemical system is more complicated than that originally proposed of uranium dispersion and water transport into a peat bog. There appears to be little thorium mobility although there is evidence for an appreciable fraction of thorium on the colloidal phases. (author)

  4. Fulltext PDF

    Indian Academy of Sciences (India)

    IAS Admin

    International subscriptions are processed by Springer (www.springer.com). For details contact: Springer Distribution Centre GmbH, Customer Service Journals. Haberstrasse 7, D-69126 Heidelberg, Germany. The Americas (North, South, Central and the Caribbean): journals-ny@springer.com. Outside the Americas: ...

  5. Pramana – Journal of Physics | Indian Academy of Sciences

    Indian Academy of Sciences (India)

    International subscriptions are processed by Springer (www.springer.com). For details contact: Springer Distribution Centre GmbH, Customer Service Journals, Haberstrasse 7, D-69126 Heidelberg, Germany. The Americas (North, South, Central and the Caribbean): journals-ny@springer.com. Outside the Americas: ...

  6. Using Landsat Spectral Indices in Time-Series to Assess Wildfire Disturbance and Recovery

    Directory of Open Access Journals (Sweden)

    Samuel Hislop

    2018-03-01

    Full Text Available Satellite earth observation is being increasingly used to monitor forests across the world. Freely available Landsat data stretching back four decades, coupled with advances in computer processing capabilities, has enabled new time-series techniques for analyzing forest change. Typically, these methods track individual pixel values over time, through the use of various spectral indices. This study examines the utility of eight spectral indices for characterizing fire disturbance and recovery in sclerophyll forests, in order to determine their relative merits in the context of Landsat time-series. Although existing research into Landsat indices is comprehensive, this study presents a new approach, by comparing the distributions of pre- and post-fire pixels using Glass’s delta, for evaluating indices without the need for detailed field information. Our results show that in the sclerophyll forests of southeast Australia, common indices, such as the Normalized Difference Vegetation Index (NDVI) and the Normalized Burn Ratio (NBR), both accurately capture wildfire disturbance in a pixel-based time-series approach, especially if images from soon after the disturbance are available. However, for tracking forest regrowth and recovery, indices, such as NDVI, which typically capture chlorophyll concentration or canopy ‘greenness’, are not as reliable, with values returning to pre-fire levels in 3–5 years. In comparison, indices that are more sensitive to forest moisture and structure, such as NBR, indicate much longer (8–10 years) recovery timeframes. This finding is consistent with studies that were conducted in other forest types. We also demonstrate that additional information regarding forest condition, particularly in relation to recovery, can be extracted from less well known indices, such as NBR2, as well as textural indices incorporating spatial variance. With Landsat time-series gaining in popularity in recent years, it is critical to
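
    Glass's delta, as used above to compare pre- and post-fire pixel distributions of an index, is simply the shift of the post-fire mean expressed in units of the pre-fire standard deviation; a minimal sketch with fabricated example values follows.

    # Minimal sketch of the Glass's delta comparison described above.
    import numpy as np

    def glass_delta(pre_fire_index, post_fire_index):
        pre = np.asarray(pre_fire_index, dtype=float)
        post = np.asarray(post_fire_index, dtype=float)
        # Shift of the post-fire distribution in units of the pre-fire ("control") std.
        return (post.mean() - pre.mean()) / pre.std(ddof=1)

    # Example with a fictitious NBR sample: a strongly negative delta indicates a
    # large fire-induced drop relative to pre-fire variability.
    rng = np.random.default_rng(0)
    nbr_pre = rng.normal(0.55, 0.08, 1000)
    nbr_post = rng.normal(0.10, 0.12, 1000)
    print(glass_delta(nbr_pre, nbr_post))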

  7. Organic Aerosol Component (OACOMP) Value-Added Product Report

    Energy Technology Data Exchange (ETDEWEB)

    Fast, J; Zhang, Q; Tilp, A; Shippert, T; Parworth, C; Mei, F

    2013-08-23

    Significantly improved returns on aerosol chemistry data can be achieved through the development of a value-added product (VAP) for deriving organic aerosol (OA) components, called Organic Aerosol Components (OACOMP). OACOMP is primarily based on multivariate analysis of the measured organic mass spectral matrix. The key outputs of OACOMP are the concentration time series and the mass spectra of OA factors that are associated with distinct sources, formation and evolution processes, and physicochemical properties.
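
    OACOMP is described as a multivariate factorization of the organic mass spectral matrix. A rough illustration of the idea, using scikit-learn's NMF as a stand-in on synthetic data (not the VAP's actual algorithm or inputs), is:

```python
import numpy as np
from sklearn.decomposition import NMF

# Synthetic "organic mass spectral matrix": rows are time steps, columns are m/z channels
rng = np.random.default_rng(1)
true_profiles = np.abs(rng.normal(size=(3, 120)))      # 3 hypothetical OA factor spectra
true_timeseries = np.abs(rng.normal(size=(500, 3)))    # their concentration time series
X = true_timeseries @ true_profiles + 0.01 * rng.random((500, 120))

# Non-negative factorization: X ≈ W H, with W = factor time series, H = factor mass spectra
model = NMF(n_components=3, init="nndsvda", max_iter=500, random_state=0)
W = model.fit_transform(X)   # concentration time series of each factor
H = model.components_        # mass spectrum of each factor
print(W.shape, H.shape)      # (500, 3) (3, 120)
```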

  8. Advanced methods for modeling water-levels and estimating drawdowns with SeriesSEE, an Excel add-in

    Science.gov (United States)

    Halford, Keith; Garcia, C. Amanda; Fenelon, Joe; Mirus, Benjamin B.

    2012-12-21

    Water-level modeling is used for multiple-well aquifer tests to reliably differentiate pumping responses from natural water-level changes in wells, or “environmental fluctuations.” Synthetic water levels are created during water-level modeling and represent the summation of multiple component fluctuations, including those caused by environmental forcing and pumping. Pumping signals are modeled by transforming step-wise pumping records into water-level changes by using superimposed Theis functions. Water-levels can be modeled robustly with this Theis-transform approach because environmental fluctuations and pumping signals are simulated simultaneously. Water-level modeling with Theis transforms has been implemented in the program SeriesSEE, which is a Microsoft® Excel add-in. Moving average, Theis, pneumatic-lag, and gamma functions transform time series of measured values into water-level model components in SeriesSEE. Earth tides and step transforms are additional computed water-level model components. Water-level models are calibrated by minimizing a sum-of-squares objective function where singular value decomposition and Tikhonov regularization stabilize results. Drawdown estimates from a water-level model are the summation of all Theis transforms minus residual differences between synthetic and measured water levels. The accuracy of drawdown estimates is limited primarily by noise in the data sets, not the Theis-transform approach. Drawdowns much smaller than environmental fluctuations have been detected across major fault structures, at distances of more than 1 mile from the pumping well, and with limited pre-pumping and recovery data at sites across the United States. In addition to water-level modeling, utilities exist in SeriesSEE for viewing, cleaning, manipulating, and analyzing time-series data.
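
    The Theis-transform idea, converting a step-wise pumping record into drawdown by superposing Theis well functions, can be sketched as follows. This is an illustrative simplification, not SeriesSEE code; the pumping record, distance, and aquifer parameters are made up:

```python
import numpy as np
from scipy.special import exp1  # Theis well function W(u) = E1(u)

def theis_drawdown(t, Q_steps, r, T, S):
    """Drawdown at distance r (m) and times t (days) from a step-wise pumping record.

    Q_steps : list of (start_time, pumping_rate) pairs in (days, m^3/day);
              each rate change is superposed as its own Theis response.
    T, S    : transmissivity (m^2/day) and storativity (dimensionless).
    """
    t = np.asarray(t, dtype=float)
    s = np.zeros_like(t)
    prev_rate = 0.0
    for t_start, rate in Q_steps:
        dQ = rate - prev_rate
        prev_rate = rate
        active = t > t_start
        u = r**2 * S / (4.0 * T * (t[active] - t_start))
        s[active] += dQ / (4.0 * np.pi * T) * exp1(u)
    return s

# One pump turned on at t = 0 days and off at t = 2 days (hypothetical values)
times = np.linspace(0.01, 5, 200)
dd = theis_drawdown(times, [(0.0, 500.0), (2.0, 0.0)], r=100.0, T=200.0, S=1e-4)
print(dd.max())
```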

  9. On statistical inference in time series analysis of the evolution of road safety.

    Science.gov (United States)

    Commandeur, Jacques J F; Bijleveld, Frits D; Bergel-Hayat, Ruth; Antoniou, Constantinos; Yannis, George; Papadimitriou, Eleonora

    2013-11-01

    Data collected for building a road safety observatory usually include observations made sequentially through time. Examples of such data, called time series data, include annual (or monthly) number of road traffic accidents, traffic fatalities or vehicle kilometers driven in a country, as well as the corresponding values of safety performance indicators (e.g., data on speeding, seat belt use, alcohol use, etc.). Some commonly used statistical techniques imply assumptions that are often violated by the special properties of time series data, namely serial dependency among disturbances associated with the observations. The first objective of this paper is to demonstrate the impact of such violations to the applicability of standard methods of statistical inference, which leads to an under or overestimation of the standard error and consequently may produce erroneous inferences. Moreover, having established the adverse consequences of ignoring serial dependency issues, the paper aims to describe rigorous statistical techniques used to overcome them. In particular, appropriate time series analysis techniques of varying complexity are employed to describe the development over time, relating the accident-occurrences to explanatory factors such as exposure measures or safety performance indicators, and forecasting the development into the near future. Traditional regression models (whether they are linear, generalized linear or nonlinear) are shown not to naturally capture the inherent dependencies in time series data. Dedicated time series analysis techniques, such as the ARMA-type and DRAG approaches are discussed next, followed by structural time series models, which are a subclass of state space methods. The paper concludes with general recommendations and practice guidelines for the use of time series models in road safety research. Copyright © 2012 Elsevier Ltd. All rights reserved.
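
    The paper's central point, that serial dependence invalidates naive standard errors, can be illustrated with a small simulation. The trend model and AR(1) disturbance below are hypothetical, and HAC (Newey-West) standard errors are used only as one simple remedy among the approaches the authors discuss:

```python
import numpy as np
import statsmodels.api as sm

# Simulate an annual fatality-like series: linear trend plus AR(1) disturbances
rng = np.random.default_rng(42)
n = 40
t = np.arange(n)
e = np.zeros(n)
for i in range(1, n):
    e[i] = 0.7 * e[i - 1] + rng.normal(scale=5.0)
y = 300 - 2.5 * t + e

X = sm.add_constant(t)
naive = sm.OLS(y, X).fit()                                            # assumes independent errors
robust = sm.OLS(y, X).fit(cov_type="HAC", cov_kwds={"maxlags": 4})    # Newey-West correction

print("naive SE of trend:", naive.bse[1])
print("HAC   SE of trend:", robust.bse[1])   # typically noticeably larger
```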

  10. A New Numerical Algorithm for Two-Point Boundary Value Problems

    OpenAIRE

    Guo, Lihua; Wu, Boying; Zhang, Dazhi

    2014-01-01

    We present a new numerical algorithm for two-point boundary value problems. We first present the exact solution in the form of series and then prove that the n-term numerical solution converges uniformly to the exact solution. Furthermore, we establish the numerical stability and error analysis. The numerical results show the effectiveness of the proposed algorithm.
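
    The authors' own series-based algorithm is not reproduced here. As a point of reference, a standard numerical solver for a two-point boundary value problem (scipy's solve_bvp) applied to a toy problem with a known exact solution looks like this:

```python
import numpy as np
from scipy.integrate import solve_bvp

# Two-point boundary value problem: y'' + y = 0, y(0) = 0, y(pi/2) = 1
# (exact solution y = sin(x)), written as a first-order system.
def rhs(x, y):
    return np.vstack([y[1], -y[0]])

def bc(ya, yb):
    return np.array([ya[0] - 0.0, yb[0] - 1.0])

x = np.linspace(0, np.pi / 2, 11)
y0 = np.zeros((2, x.size))          # initial guess
sol = solve_bvp(rhs, bc, x, y0)

x_fine = np.linspace(0, np.pi / 2, 101)
print(np.max(np.abs(sol.sol(x_fine)[0] - np.sin(x_fine))))  # small error vs exact solution
```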

  11. Asymptotic series and functional integrals in quantum field theory

    International Nuclear Information System (INIS)

    Shirkov, D.V.

    1979-01-01

    Investigations of the methods for analyzing ultraviolet and infrared asymptotics in quantum field theory (QFT) are reviewed. In the first stage, a powerful method of QFT analysis connected with the group property of renormalization transformations was created. The result of the second period of studies is a constructive solution of the problem of going beyond the framework of weak coupling. In the third stage, the essential tools are the asymptotic series and functional integrals in QFT, which are used to obtain the asymptotic behaviour of the power-expansion coefficients in the coupling constant at high orders for a number of simple models. Further advance to higher values of the coupling constant requires surmounting the difficulties resulting from the asymptotic character of the expansions and their constructive application in the region of strong coupling (g >> 1).

  12. Automated Bayesian model development for frequency detection in biological time series

    Directory of Open Access Journals (Sweden)

    Oldroyd Giles ED

    2011-06-01

    the requirement for uniformly sampled data. Biological time series often deviate significantly from the requirements of optimality for Fourier transformation. In this paper we present an alternative approach based on Bayesian inference. We show the value of placing spectral analysis in the framework of Bayesian inference and demonstrate how model comparison can automate this procedure.

  13. Automated Bayesian model development for frequency detection in biological time series.

    Science.gov (United States)

    Granqvist, Emma; Oldroyd, Giles E D; Morris, Richard J

    2011-06-24

    series often deviate significantly from the requirements of optimality for Fourier transformation. In this paper we present an alternative approach based on Bayesian inference. We show the value of placing spectral analysis in the framework of Bayesian inference and demonstrate how model comparison can automate this procedure.
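
    The Bayesian model-comparison machinery of the paper is not shown here. As a simpler classical baseline for frequency detection in unevenly sampled series, a Lomb-Scargle periodogram (scipy) on synthetic data might look like this:

```python
import numpy as np
from scipy.signal import lombscargle

# Unevenly sampled noisy oscillation (irregular sampling times, hypothetical signal)
rng = np.random.default_rng(3)
t = np.sort(rng.uniform(0, 100, 300))           # irregular sampling times (s)
f_true = 0.12                                   # true frequency in Hz
y = np.sin(2 * np.pi * f_true * t) + 0.4 * rng.normal(size=t.size)
y -= y.mean()                                   # remove the mean before the periodogram

freqs_hz = np.linspace(0.01, 0.5, 2000)
power = lombscargle(t, y, 2 * np.pi * freqs_hz)  # lombscargle expects angular frequencies
print("peak frequency (Hz):", freqs_hz[np.argmax(power)])  # close to 0.12
```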

  14. Application of the Laplace transform method for computational modelling of radioactive decay series

    Energy Technology Data Exchange (ETDEWEB)

    Oliveira, Deise L.; Damasceno, Ralf M.; Barros, Ricardo C. [Univ. do Estado do Rio de Janeiro (IME/UERJ) (Brazil). Programa de Pos-graduacao em Ciencias Computacionais

    2012-03-15

    It is well known that when spent fuel is removed from the core, it is still composed of considerable amount of radioactive elements with significant half-lives. Most actinides, in particular plutonium, fall into this category, and have to be safely disposed of. One solution is to store the long-lived spent fuel as it is, by encasing and burying it deep underground in a stable geological formation. This implies estimating the transmutation of these radioactive elements with time. Therefore, we describe in this paper the application of the Laplace transform technique in matrix formulation to analytically solve initial value problems that mathematically model radioactive decay series. Given the initial amount of each type of radioactive isotopes in the decay series, the computer code generates the amount at a given time of interest, or may plot a graph of the evolution in time of the amount of each type of isotopes in the series. This computer code, that we refer to as the LTRad{sub L} code, where L is the number of types of isotopes belonging to the series, was developed using the Scilab free platform for numerical computation and can model one segment or the entire chain of any of the three radioactive series existing on Earth today. Numerical results are given to typical model problems to illustrate the computer code efficiency and accuracy. (orig.)

  15. Application of the Laplace transform method for computational modelling of radioactive decay series

    International Nuclear Information System (INIS)

    Oliveira, Deise L.; Damasceno, Ralf M.; Barros, Ricardo C.

    2012-01-01

    It is well known that when spent fuel is removed from the core, it is still composed of considerable amount of radioactive elements with significant half-lives. Most actinides, in particular plutonium, fall into this category, and have to be safely disposed of. One solution is to store the long-lived spent fuel as it is, by encasing and burying it deep underground in a stable geological formation. This implies estimating the transmutation of these radioactive elements with time. Therefore, we describe in this paper the application of the Laplace transform technique in matrix formulation to analytically solve initial value problems that mathematically model radioactive decay series. Given the initial amount of each type of radioactive isotopes in the decay series, the computer code generates the amount at a given time of interest, or may plot a graph of the evolution in time of the amount of each type of isotopes in the series. This computer code, that we refer to as the LTRad L code, where L is the number of types of isotopes belonging to the series, was developed using the Scilab free platform for numerical computation and can model one segment or the entire chain of any of the three radioactive series existing on Earth today. Numerical results are given to typical model problems to illustrate the computer code efficiency and accuracy. (orig.)
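
    The decay-chain initial value problem that the Laplace-transform treatment solves analytically can also be evaluated numerically with a matrix exponential. A minimal sketch (not the authors' code, and with purely illustrative half-lives) is:

```python
import numpy as np
from scipy.linalg import expm

def decay_chain_amounts(N0, half_lives_years, t_years):
    """Amounts of each isotope in a linear decay chain at time t.

    Solves dN/dt = A N with A lower bidiagonal (decay out on the diagonal,
    feed from the parent on the subdiagonal); this is the same initial value
    problem that the Laplace-transform formulation solves analytically.
    """
    lam = np.log(2.0) / np.asarray(half_lives_years, dtype=float)
    n = lam.size
    A = np.diag(-lam)
    for i in range(1, n):
        A[i, i - 1] = lam[i - 1]          # production from the parent isotope
    return expm(A * t_years) @ np.asarray(N0, dtype=float)

# Illustrative 3-member segment of a chain (half-lives in years, hypothetical values)
print(decay_chain_amounts(N0=[1.0e6, 0.0, 0.0],
                          half_lives_years=[4.5e9, 2.5e5, 7.5e4],
                          t_years=1.0e6))
```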

  16. Sr isotopes and U series radionuclides in the Sangemini area (Central Italy): Hydrogeology implications

    Directory of Open Access Journals (Sweden)

    Maurizio Barbieri

    2014-06-01

    Full Text Available The strontium isotopic ratio (expressed as 87Sr/86Sr) of groundwater represents a useful method for studying and understanding groundwater circulation; in addition, the U and Ra isotopic compositions can vary as a function of the groundwater residence time. This paper reports an evaluation of the probable recharge area of the Sangemini mineral water springs (Terni, Umbria, Central Italy) and an estimate of the residence time of the aquifer by coupling Sr and U series isotopic systematics. For this study, four water samples were analyzed for the 87Sr/86Sr isotope ratio, and eleven samples, shallow waters and groundwaters, for U and Ra; furthermore, the isotopic ratios were determined for samples of typical rocks of the area. The results of this study allow the identification of: a recharge area in a restricted sector of the Meso-Cenozoic carbonates; a longer and more effective water/rock interaction in the Quaternary series. U and Ra recoil models allow an estimate of a groundwater residence time of about 350 years and a total water volume whose value (64×10⁶ m³) agrees with the limited extension of the aquifer. The extension of the aquifer was constrained by comparing the Sr isotopic composition of waters and local geological formations. Groundwaters seem mainly to circulate in the clayey sandy Quaternary series characterized by low redox conditions.

  17. International Work-Conference on Time Series

    CERN Document Server

    Pomares, Héctor

    2016-01-01

    This volume presents selected peer-reviewed contributions from The International Work-Conference on Time Series, ITISE 2015, held in Granada, Spain, July 1-3, 2015. It discusses topics in time series analysis and forecasting, advanced methods and online learning in time series, high-dimensional and complex/big data time series as well as forecasting in real problems. The International Work-Conferences on Time Series (ITISE) provide a forum for scientists, engineers, educators and students to discuss the latest ideas and implementations in the foundations, theory, models and applications in the field of time series analysis and forecasting. It focuses on interdisciplinary and multidisciplinary research encompassing the disciplines of computer science, mathematics, statistics and econometrics.

  18. Principal components and iterative regression analysis of geophysical series: Application to Sunspot number (1750-2004)

    Science.gov (United States)

    Nordemann, D. J. R.; Rigozo, N. R.; de Souza Echer, M. P.; Echer, E.

    2008-11-01

    We present here an implementation of a least squares iterative regression method applied to the sine functions embedded in the principal components extracted from geophysical time series. This method seems to represent a useful improvement for the quantitative analysis of periodicities in non-stationary time series. The determination of the principal components followed by the least squares iterative regression method was implemented in an algorithm written in the Scilab (2006) language. The main result of the method is to obtain the set of sine functions embedded in the series analyzed in decreasing order of significance, from the most important ones, likely to represent the physical processes involved in the generation of the series, to the less important ones that represent noise components. Taking into account the need for a deeper knowledge of the Sun's past history and its implications for global climate change, the method was applied to the Sunspot Number series (1750-2004). With the threshold and parameter values used here, the application of the method leads to a total of 441 explicit sine functions, among which 65 were considered significant and were used for a reconstruction that gave a normalized mean squared error of 0.146.
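
    A stripped-down version of the least-squares iterative extraction of sine components (a greedy frequency scan plus linear least squares, without the principal-component step) might look like the following sketch on a synthetic two-cycle series:

```python
import numpy as np

def extract_sines(t, y, n_components=3, freqs=None):
    """Greedy least-squares extraction of sine components, most significant first.

    At each step, scan candidate frequencies, fit a*sin + b*cos by linear least
    squares, keep the frequency with the smallest residual sum of squares, and
    subtract that component before continuing.
    """
    if freqs is None:
        freqs = np.linspace(1.0 / (t[-1] - t[0]), 0.5 / np.median(np.diff(t)), 2000)
    residual = y - y.mean()
    components = []
    for _ in range(n_components):
        best = None
        for f in freqs:
            X = np.column_stack([np.sin(2 * np.pi * f * t), np.cos(2 * np.pi * f * t)])
            coef, *_ = np.linalg.lstsq(X, residual, rcond=None)
            rss = np.sum((residual - X @ coef) ** 2)
            if best is None or rss < best[0]:
                best = (rss, f, coef, X)
        rss, f, coef, X = best
        components.append((f, np.hypot(*coef)))      # (frequency, amplitude)
        residual = residual - X @ coef
    return components

# Toy series with two periodicities (an 11-unit and a 5.5-unit cycle, made up)
t = np.arange(0, 250, 1.0)
rng = np.random.default_rng(0)
y = 60 * np.sin(2 * np.pi * t / 11.0) + 20 * np.sin(2 * np.pi * t / 5.5) + rng.normal(0, 5, t.size)
print(extract_sines(t, y, n_components=2))
```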

  19. Kolmogorov Space in Time Series Data

    OpenAIRE

    Kanjamapornkul, K.; Pinčák, R.

    2016-01-01

    We provide a proof that the space of time series data is a Kolmogorov space with the $T_{0}$-separation axiom, using the loop space of time series data. In our approach we define a cyclic coordinate of the intrinsic time scale of time series data after empirical mode decomposition. A spinor field of time series data comes from the rotation of the data around the price and time axes, obtained by defining a new extra dimension for time series data. We show that there exist eight hidden dimensions in the Kolmogorov space for ...

  20. Attractors, bifurcations, & chaos nonlinear phenomena in economics

    CERN Document Server

    Puu, Tönu

    2003-01-01

    The present book relies on various editions of my earlier book "Nonlinear Economic Dynamics", first published in 1989 in the Springer series "Lecture Notes in Economics and Mathematical Systems", and republished in three more, successively revised and expanded editions, as a Springer monograph, in 1991, 1993, and 1997, and in a Russian translation as "Nelineynaia Economicheskaia Dinamica". The first three editions were focused on applications. The last was different, as it also included some chapters with mathematical background material - ordinary differential equations and iterated maps - so as to make the book self-contained and suitable as a textbook for economics students of dynamical systems. To the same pedagogical purpose, the number of illustrations was expanded. The book published in 2000, with the title "Attractors, Bifurcations, and Chaos - Nonlinear Phenomena in Economics", was so much changed that the author felt it reasonable to give it a new title. There were two new mathematics ch...

  1. Scanning electron microscopy physics of image formation and microanalysis

    CERN Document Server

    Reimer, Ludwig

    1985-01-01

    The aim of this book is to outline the physics of image formation, electron-specimen interactions, imaging modes, the interpretation of micrographs and the use of quantitative modes in scanning electron microscopy (SEM). It forms a counterpart to Transmission Electron Microscopy (Vol. 36 of this Springer Series in Optical Sciences). The book evolved from lectures delivered at the University of Münster and from a German text entitled Raster-Elektronenmikroskopie (Springer-Verlag), published in collaboration with my colleague Gerhard Pfefferkorn. In the introductory chapter, the principles of the SEM and of electron-specimen interactions are described, the most important imaging modes and their associated contrast are summarized, and general aspects of elemental analysis by x-ray and Auger electron emission are discussed. The electron gun and electron optics are discussed in Chap. 2 in order to show how an electron probe of small diameter can be formed, how the electron beam can be blanked at high fre...

  2. CERN Action on Open Access : Open Meeting on Changing the Publishing Model

    CERN Multimedia

    CERN. Geneva

    2005-01-01

    Leader of the discussion: Chief Scientific Officer Jos Engelen, CERN. Particle physicists are again contributing to change by Director-General Robert Aymar, CERN. A general presentation of the CERN policy and visions. Improving the impact of your research by Former Editor-in-Chief Alex Bradshaw, New Journal of Physics. Springer Open Choice by Chief executive officer Derk Haank, Springer. The JHEP experience by Scientific director Hector Rubinstein, JHEP. The impact of the J series, existing and coming journals: JHEP JCAP JSTAT JINST. National libraries ensuring long-term archiving of digital information speaker to be decided. Debate The Director-General is calling all CERN editors and authors to a meeting to contribute to the discussion on the direction that CERN should take in its experimentation with new publishing models. The current subscription-funded publishing model for journal articles (where access to a particular journal is granted upon payment of a subscription, often arranged by the institutional ...

  3. Framework for Evaluating the Total Value Proposition of Clean Energy Technologies

    Energy Technology Data Exchange (ETDEWEB)

    Pater, J. E.

    2006-02-01

    Conventional valuation techniques fail to include many of the financial advantages of clean energy technologies. By omitting benefits associated with risk management, emissions reductions, policy incentives, resource use, corporate social responsibility, and societal economic benefits, investors and firms sacrifice opportunities for new revenue streams and avoided costs. In an effort to identify some of these externalities, this analysis develops a total value proposition for clean energy technologies. It incorporates a series of values under each of the above categories, describing the opportunities for recapturing investments throughout the value chain. The framework may be used to create comparable value propositions for clean energy technologies supporting investment decisions, project siting, and marketing strategies. It can also be useful in policy-making decisions.

  4. Development of New Loan Payment Models with Piecewise Geometric Gradient Series

    Directory of Open Access Journals (Sweden)

    Erdal Aydemir

    2014-12-01

    Full Text Available Engineering economics plays an important role in decision making. Also, cash flows, the time value of money and interest rates are among the most important research fields in mathematical finance. Generalized formulae obtained from a variety of models with the time value of money and cash flows are inadequate to solve some problems. In this study, a new generalized formula is considered for the first time, derived from a loan payment model in which a certain number of payment amounts are determined by the customer at the beginning of the payment period and the remaining repayments follow a piecewise linear gradient series. As a result, some numerical examples with solutions are given for the developed models.
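
    For reference, the closed-form present value of an ordinary (non-piecewise) geometric gradient series, which the generalized formula in the paper extends, can be coded as a small function; the rates and payment values below are made up:

```python
def pv_geometric_gradient(A1, i, g, n):
    """Present value of a geometric gradient series:
    payments A1, A1(1+g), ..., A1(1+g)^(n-1), discounted at rate i per period."""
    if abs(i - g) < 1e-12:                       # limiting case i == g
        return A1 * n / (1.0 + i)
    return A1 * (1.0 - ((1.0 + g) / (1.0 + i)) ** n) / (i - g)

# Example: first repayment 1000, growing 2% per month, 24 months, 1% monthly interest
print(round(pv_geometric_gradient(1000.0, 0.01, 0.02, 24), 2))
```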

  5. Long time series

    DEFF Research Database (Denmark)

    Hisdal, H.; Holmqvist, E.; Hyvärinen, V.

    Awareness that emission of greenhouse gases will raise the global temperature and change the climate has led to studies trying to identify such changes in long-term climate and hydrologic time series. This report, written by the...

  6. Modeling Periodic Impulsive Effects on Online TV Series Diffusion.

    Science.gov (United States)

    Fu, Peihua; Zhu, Anding; Fang, Qiwen; Wang, Xi

    also represents a highly correlated analysis tool to evaluate the advertising value of TV series.

  7. Modeling Periodic Impulsive Effects on Online TV Series Diffusion.

    Directory of Open Access Journals (Sweden)

    Peihua Fu

    social communities also represents a highly correlated analysis tool to evaluate the advertising value of TV series.

  8. Modeling Periodic Impulsive Effects on Online TV Series Diffusion

    Science.gov (United States)

    Fang, Qiwen; Wang, Xi

    2016-01-01

    . The buzz in public social communities also represents a highly correlated analysis tool to evaluate the advertising value of TV series. PMID:27669520

  9. Analysis of series resonant converter with series-parallel connection

    Science.gov (United States)

    Lin, Bor-Ren; Huang, Chien-Lan

    2011-02-01

    In this study, a parallel inductor-inductor-capacitor (LLC) resonant converter series-connected on the primary side and parallel-connected on the secondary side is presented for server power supply systems. Based on series resonant behaviour, the power metal-oxide-semiconductor field-effect transistors are turned on at zero voltage switching and the rectifier diodes are turned off at zero current switching. Thus, the switching losses on the power semiconductors are reduced. In the proposed converter, the primary windings of the two LLC converters are connected in series. Thus, the two converters have the same primary currents to ensure that they can supply the balance load current. On the output side, two LLC converters are connected in parallel to share the load current and to reduce the current stress on the secondary windings and the rectifier diodes. In this article, the principle of operation, steady-state analysis and design considerations of the proposed converter are provided and discussed. Experiments with a laboratory prototype with a 24 V/21 A output for server power supply were performed to verify the effectiveness of the proposed converter.

  10. The vanishing discount problem and viscosity Mather measures. Part 2: boundary value problems

    OpenAIRE

    Ishii, Hitoshi; Mitake, Hiroyoshi; Tran, Hung V.

    2016-01-01

    In arXiv:1603.01051 (Part 1 of this series), we have introduced a variational approach to studying the vanishing discount problem for fully nonlinear, degenerate elliptic, partial differential equations in a torus. We develop this approach further here to handle boundary value problems. In particular, we establish new representation formulas for solutions of discount problems, critical values, and use them to prove convergence results for the vanishing discount problems.

  11. Patients' Values in Clinical Decision-Making.

    Science.gov (United States)

    Faggion, Clovis Mariano; Pachur, Thorsten; Giannakopoulos, Nikolaos Nikitas

    2017-09-01

    Shared decision-making involves the participation of patient and dental practitioner. Well-informed decision-making requires that both parties understand important concepts that may influence the decision. This fourth article in a series of 4 aims to discuss the importance of patients' values when a clinical decision is made. We report on how to incorporate important concepts for well-informed, shared decision-making. Here, we present patient values as an important issue, in addition to previously established topics such as the risk of bias of a study, cost-effectiveness of treatment approaches, and a comparison of therapeutic benefit with potential side effects. We provide 2 clinical examples and suggestions for a decision tree, based on the available evidence. The information reported in this article may improve the relationship between patient and dental practitioner, resulting in more well-informed clinical decisions. Copyright © 2017 Elsevier Inc. All rights reserved.

  12. Optimization of inhibitory decision rules relative to length and coverage

    KAUST Repository

    Alsolami, Fawaz

    2012-01-01

    The paper is devoted to the study of algorithms for optimization of inhibitory rules relative to the length and coverage. In contrast with usual rules that have on the right-hand side a relation "attribute ≠ value", inhibitory rules have a relation "attribute = value" on the right-hand side. The considered algorithms are based on extensions of dynamic programming. © 2012 Springer-Verlag.

  13. Upper Gastrointestinal (GI) Series

    Science.gov (United States)

    ... a standard barium upper GI series, which uses only barium; a double-contrast upper GI series, which uses both air and ... evenly coat your upper GI tract with the barium. If you are having a double-contrast study, you will swallow gas-forming crystals that ...

  14. Discussion About Nonlinear Time Series Prediction Using Least Squares Support Vector Machine

    International Nuclear Information System (INIS)

    Xu Ruirui; Bian Guoxing; Gao Chenfeng; Chen Tianlun

    2005-01-01

    The least squares support vector machine (LS-SVM) is used to study nonlinear time series prediction. First, the parameter γ and the multi-step prediction capabilities of the LS-SVM network are discussed. Then we employ a clustering method in the model to prune the number of support values. The learning rate and the noise-filtering capabilities of the LS-SVM are both greatly improved.
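
    LS-SVM regression is closely related to kernel ridge regression. A minimal sketch of nonlinear one-step-ahead prediction on a lag embedding, using scikit-learn's KernelRidge as a stand-in for an LS-SVM (the series and hyperparameters are arbitrary), is:

```python
import numpy as np
from sklearn.kernel_ridge import KernelRidge

def embed(series, lags):
    """Build a lagged design matrix: predict x[t] from x[t-1..t-lags]."""
    X = np.column_stack([series[lags - k - 1: len(series) - k - 1] for k in range(lags)])
    y = series[lags:]
    return X, y

# Noisy nonlinear surrogate series (made up)
rng = np.random.default_rng(0)
t = np.arange(1000)
x = np.sin(0.05 * t) * np.cos(0.017 * t) + 0.05 * rng.normal(size=t.size)

X, y = embed(x, lags=8)
split = 800
model = KernelRidge(kernel="rbf", alpha=1e-3, gamma=0.5)  # gamma plays the role of the kernel width
model.fit(X[:split], y[:split])
pred = model.predict(X[split:])
print("test RMSE:", np.sqrt(np.mean((pred - y[split:]) ** 2)))
```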

  15. Q value for the 34Cl superallowed beta decay

    International Nuclear Information System (INIS)

    Lin, S.; Brindhaban, S.A.; Barker, P.H.

    1994-01-01

    The proton separation energy of ³⁴Cl has been measured to be 5143.30(5) keV in a series of overlapping and self-consistent experiments involving the study of three resonances in the ³³S(p,γ) reaction, using a variety of techniques in laboratories in Auckland and Florence. This figure is in disagreement with the only other published value. By combining the proton separation energy with the only measured value for the ³⁴S neutron separation energy, the Q_EC for the ³⁴Cl superallowed decay becomes 5491.48(12) keV. Some comments are offered on the associated ³⁴S(p,n)³⁴Cl threshold energy measurement.

  16. In-situ Kd values and geochemical behavior for inorganic and organic constituents of concern at the TNX Outfall Delta

    International Nuclear Information System (INIS)

    Kaplan, D.I.

    2000-01-01

    A series of tests were conducted to provide site-specific Kd values for constituents of concern at the TNX Outfall Delta Operable Unit. These Kd values can be used to calculate contaminant migration within the operable unit and are, at this time, considered to be the most defensible values.

  17. Evaluation of the best fit distribution for partial duration series of daily rainfall in Madinah, western Saudi Arabia

    Science.gov (United States)

    Alahmadi, F.; Rahman, N. A.; Abdulrazzak, M.

    2014-09-01

    Rainfall frequency analysis is an essential tool for the design of water related infrastructure. It can be used to predict future flood magnitudes for a given magnitude and frequency of extreme rainfall events. This study analyses the application of rainfall partial duration series (PDS) in the vast growing urban Madinah city located in the western part of Saudi Arabia. Different statistical distributions were applied (i.e. Normal, Log Normal, Extreme Value type I, Generalized Extreme Value, Pearson Type III, Log Pearson Type III) and their distribution parameters were estimated using L-moments methods. Also, different selection criteria models are applied, e.g. Akaike Information Criterion (AIC), Corrected Akaike Information Criterion (AICc), Bayesian Information Criterion (BIC) and Anderson-Darling Criterion (ADC). The analysis indicated the advantage of Generalized Extreme Value as the best fit statistical distribution for Madinah partial duration daily rainfall series. The outcome of such an evaluation can contribute toward better design criteria for flood management, especially flood protection measures.

  18. Evaluation of the best fit distribution for partial duration series of daily rainfall in Madinah, western Saudi Arabia

    Directory of Open Access Journals (Sweden)

    F. Alahmadi

    2014-09-01

    Full Text Available Rainfall frequency analysis is an essential tool for the design of water related infrastructure. It can be used to predict future flood magnitudes for a given magnitude and frequency of extreme rainfall events. This study analyses the application of rainfall partial duration series (PDS) in the vast growing urban Madinah city located in the western part of Saudi Arabia. Different statistical distributions were applied (i.e. Normal, Log Normal, Extreme Value type I, Generalized Extreme Value, Pearson Type III, Log Pearson Type III) and their distribution parameters were estimated using L-moments methods. Also, different selection criteria models are applied, e.g. Akaike Information Criterion (AIC), Corrected Akaike Information Criterion (AICc), Bayesian Information Criterion (BIC) and Anderson-Darling Criterion (ADC). The analysis indicated the advantage of Generalized Extreme Value as the best fit statistical distribution for Madinah partial duration daily rainfall series. The outcome of such an evaluation can contribute toward better design criteria for flood management, especially flood protection measures.
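
    A simplified version of the distribution-selection step can be sketched with scipy. Note that the study fits parameters by L-moments, whereas scipy's fit method uses maximum likelihood, and the "partial duration series" below is synthetic:

```python
import numpy as np
from scipy import stats

# Hypothetical partial-duration series of daily rainfall exceedances (mm)
rng = np.random.default_rng(7)
pds = stats.genextreme.rvs(c=-0.1, loc=25, scale=10, size=150, random_state=rng)

candidates = {
    "Normal": stats.norm,
    "Log-Normal": stats.lognorm,
    "Gumbel (EV-I)": stats.gumbel_r,
    "GEV": stats.genextreme,
    "Pearson III": stats.pearson3,
}

for name, dist in candidates.items():
    params = dist.fit(pds)                        # maximum-likelihood fit (not L-moments)
    loglik = np.sum(dist.logpdf(pds, *params))
    aic = 2 * len(params) - 2 * loglik            # smaller AIC = better candidate
    print(f"{name:14s} AIC = {aic:8.1f}")
```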

  19. Rearrangement and convergence improvement of the Born series in scattering theory on the basis of orthogonal projections

    International Nuclear Information System (INIS)

    Kukulin, V.I.; Pomerantsev, V.N.

    1976-01-01

    A method for rearrangement of the Born series in scattering theory is proposed which uses the recently proposed orthogonal projecting pseudopotentials (OPP). It is proved rigorously that the rearranged Born series converges for all negative and small positive energy values, even in the presence of bound states. A method for the correct introduction of scattering operators in orthogonal subspaces is presented. A comparison of the OPP method with the projection technique developed by Feshbach is given. Physical applications of the formulated method are discussed.

  20. Quantum trigonometric Calogero-Sutherland model, irreducible characters and Clebsch-Gordan series for the exceptional algebra E7

    International Nuclear Information System (INIS)

    Fernandez Nunez, J.; Garcia Fuertes, W.; Perelomov, A.M.

    2005-01-01

    We reexpress the quantum Calogero-Sutherland model for the Lie algebra E7 and the particular value of the coupling constant κ=1 by using the fundamental irreducible characters of the algebra as dynamical variables. For that, we need to develop a systematic procedure to obtain all the Clebsch-Gordan series required to perform the change of variables. We describe how the resulting quantum Hamiltonian operator can be used to compute more characters and Clebsch-Gordan series for this exceptional algebra.

  1. Climatology and time series of surface meteorology in Ny-Ålesund, Svalbard

    Directory of Open Access Journals (Sweden)

    M. Maturilli

    2013-04-01

    Full Text Available A consistent meteorological dataset of the Arctic site Ny-Ålesund (11.9° E, 78.9° N) spanning the 18-yr period 1 August 1993 to 31 July 2011 is presented. Instrumentation and data handling of temperature, humidity, wind and pressure measurements are described in detail. Monthly mean values are shown for all years to illustrate the interannual variability of the different parameters. Climatological mean values are given for temperature, humidity and pressure. From the climatological dataset, we also present the time series of annual mean temperature and humidity, revealing a temperature increase of +1.35 K per decade and an increase in water vapor mixing ratio of +0.22 g kg⁻¹ per decade for the given time period, respectively. With the continuation of the presented measurements, the Ny-Ålesund high resolution time series will provide a reliable source to monitor Arctic change and retrieve trends in the future. The relevant data are provided in high temporal resolution as averages over 5 (1) min before (after) 14 July 1998, respectively, placed on the PANGAEA repository (doi:10.1594/PANGAEA.793046). While 6-hourly synoptic observations in Ny-Ålesund by the Norwegian Meteorological Institute reach back to 1974 (Førland et al., 2011), the meteorological data presented here cover a shorter time period, but their high temporal resolution will be of value for atmospheric process studies on shorter time scales.

  2. Time series analysis of the behavior of brazilian natural rubber

    Directory of Open Access Journals (Sweden)

    Antônio Donizette de Oliveira

    2009-03-01

    Full Text Available Natural rubber is a non-wood product obtained from the coagulation of the latices of some forest species, Hevea brasiliensis being the main one. Native to the Amazon Region, this species was already known by the Indians before the discovery of America. Natural rubber became a globally valued product due to its multiple applications in the economy, its almost perfect substitute being the synthetic rubber derived from petroleum. Similarly to what happens with countless other products, the forecast of future prices of natural rubber has been the object of many studies. The use of univariate time series forecasting models stands out as the most accurate and useful way to reduce the uncertainty in the economic decision-making process. This study analyzed the historical series of prices of Brazilian natural rubber (R$/kg) in the Jan/99 - Jun/2006 period, in order to characterize the rubber price behavior in the domestic market; estimated a model for the time series of monthly natural rubber prices; and forecast the domestic prices of natural rubber, in the Jul/2006 - Jun/2007 period, based on the estimated models. The models studied were those belonging to the ARIMA family. The main results were: the domestic market of natural rubber is expanding due to the growth of the world economy; among the adjusted models, the ARIMA (1,1,1) model provided the best adjustment of the time series of prices of natural rubber (R$/kg); the forecasts obtained for the series provided statistically adequate fits.

  3. Cyclo-speed reducer 6000 series; Saikuro® gensokuki 6000 series

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2000-04-20

    This series was put on the market as the advanced speed reducer '6000 series' in April 2000, after further improving various excellent features of previous models by adopting innovative technologies. The cyclo-speed reducer series, which adopts a unique inscribed epicyclic gear mechanism, has reached cumulative sales of 7 million units. Main specifications: (1) Input capacity range: 0.1-132 kW, (2) Output torque: 24-68,200 N·m, (3) Reduction ratio: 6-1,000,000. Features: (1) High efficiency and long life by adopting an analysis system based on the latest analytical technology, (2) Noise reduction by a maximum of nearly 6 dB and tone improvement by adopting a new tooth profile, (3) Weight reduction by a maximum of nearly 40% by adopting a motor direct-coupled mechanism. (translated by NEDO)

  4. Relations between elliptic multiple zeta values and a special derivation algebra

    International Nuclear Information System (INIS)

    Broedel, Johannes; Matthes, Nils; Schlotterer, Oliver

    2016-01-01

    We investigate relations between elliptic multiple zeta values (eMZVs) and describe a method to derive the number of indecomposable elements of given weight and length. Our method is based on representing eMZVs as iterated integrals over Eisenstein series and exploiting the connection with a special derivation algebra. Its commutator relations give rise to constraints on the iterated integrals over Eisenstein series relevant for eMZVs and thereby allow us to count the indecomposable representatives. Conversely, the above connection suggests apparently new relations in the derivation algebra. Under https://tools.aei.mpg.de/emzv we provide relations for eMZVs over a wide range of weights and lengths. (paper)

  5. Updating Landsat time series of surface-reflectance composites and forest change products with new observations

    Science.gov (United States)

    Hermosilla, Txomin; Wulder, Michael A.; White, Joanne C.; Coops, Nicholas C.; Hobart, Geordie W.

    2017-12-01

    The use of time series satellite data allows for the temporally dense, systematic, transparent, and synoptic capture of land dynamics over time. Subsequent to the opening of the Landsat archive, several time series approaches for characterizing landscape change have been developed, often representing a particular analytical time window. The information richness and widespread utility of these time series data have created a need to maintain the currency of time series information via the addition of new data, as it becomes available. When an existing time series is temporally extended, it is critical that previously generated change information remains consistent, thereby not altering reported change statistics or science outcomes based on that change information. In this research, we investigate the impacts and implications of adding additional years to an existing 29-year annual Landsat time series for forest change. To do so, we undertook a spatially explicit comparison of the 29 overlapping years of a time series representing 1984-2012, with a time series representing 1984-2016. Surface reflectance values, and presence, year, and type of change were compared. We found that the addition of years to extend the time series had minimal effect on the annual surface reflectance composites, with slight band-specific differences (r ≥ 0.1) in the final years of the original time series being updated. The area of stand replacing disturbances and determination of change year are virtually unchanged for the overlapping period between the two time-series products. Over the overlapping temporal period (1984-2012), the total area of change differs by 0.53%, equating to an annual difference in change area of 0.019%. Overall, the spatial and temporal agreement of the changes detected by both time series was 96%. Further, our findings suggest that the entire pre-existing historic time series does not need to be re-processed during the update process. Critically, given the time

  6. Visibility Graph Based Time Series Analysis.

    Science.gov (United States)

    Stephen, Mutua; Gu, Changgui; Yang, Huijie

    2015-01-01

    Network based time series analysis has made considerable achievements in recent years. By mapping mono/multivariate time series into networks, one can investigate both its microscopic and macroscopic behaviors. However, most proposed approaches lead to the construction of static networks, consequently providing limited information on evolutionary behaviors. In the present paper we propose a method called visibility graph based time series analysis, in which series segments are mapped to visibility graphs as descriptions of the corresponding states and the successively occurring states are linked. This procedure converts a time series into a temporal network and at the same time a network of networks. Findings from empirical records for stock markets in USA (S&P500 and Nasdaq) and artificial series generated by means of fractional Gaussian motions show that the method can provide us rich information benefiting short-term and long-term predictions. Theoretically, we propose a method to investigate time series from the viewpoint of a network of networks.

  7. Visibility Graph Based Time Series Analysis.

    Directory of Open Access Journals (Sweden)

    Mutua Stephen

    Full Text Available Network based time series analysis has made considerable achievements in recent years. By mapping mono/multivariate time series into networks, one can investigate both its microscopic and macroscopic behaviors. However, most proposed approaches lead to the construction of static networks, consequently providing limited information on evolutionary behaviors. In the present paper we propose a method called visibility graph based time series analysis, in which series segments are mapped to visibility graphs as descriptions of the corresponding states and the successively occurring states are linked. This procedure converts a time series into a temporal network and at the same time a network of networks. Findings from empirical records for stock markets in USA (S&P500 and Nasdaq) and artificial series generated by means of fractional Gaussian motions show that the method can provide us rich information benefiting short-term and long-term predictions. Theoretically, we propose a method to investigate time series from the viewpoint of a network of networks.
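
    The natural visibility graph mapping used by such methods is easy to state in code. A minimal (quadratic-time) sketch, independent of the paper's implementation, is:

```python
import numpy as np

def natural_visibility_graph(series):
    """Edge list of the natural visibility graph of a time series.

    Points (i, y_i) and (j, y_j) are connected if every intermediate point
    (k, y_k) lies strictly below the straight line joining them.
    """
    y = np.asarray(series, dtype=float)
    n = y.size
    edges = []
    for i in range(n - 1):
        for j in range(i + 1, n):
            visible = all(
                y[k] < y[i] + (y[j] - y[i]) * (k - i) / (j - i)
                for k in range(i + 1, j)
            )
            if visible:
                edges.append((i, j))
    return edges

print(natural_visibility_graph([3.0, 1.0, 2.0, 0.5, 4.0]))
```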

  8. Flood Frequency Analysis For Partial Duration Series In Ganjiang River Basin

    Science.gov (United States)

    zhangli, Sun; xiufang, Zhu; yaozhong, Pan

    2016-04-01

    Accurate estimation of flood frequency is key to effective, nationwide flood damage abatement programs. The partial duration series (PDS) method is widely used in hydrologic studies because it considers all events above a certain threshold level as compared to the annual maximum series (AMS) method, which considers only the annual maximum value. However, the PDS has a drawback in that it is difficult to define the thresholds and maintain an independent and identical distribution of the partial duration time series; this drawback is discussed in this paper. The Ganjiang River is the seventh largest tributary of the Yangtze River, the longest river in China. The Ganjiang River covers a drainage area of 81,258 km2 at the Wanzhou hydrologic station as the basin outlet. In this work, 56 years of daily flow data (1954-2009) from the Wanzhou station were used to analyze flood frequency, and the Pearson-III model was employed as the hydrologic probability distribution. Generally, three tasks were accomplished: (1) the threshold of PDS by percentile rank of daily runoff was obtained; (2) trend analysis of the flow series was conducted using PDS; and (3) flood frequency analysis was conducted for partial duration flow series. The results showed a slight upward trend of the annual runoff in the Ganjiang River basin. The maximum flow with a 0.01 exceedance probability (corresponding to a 100-year flood peak under stationary conditions) was 20,000 m3/s, while that with a 0.1 exceedance probability was 15,000 m3/s. These results will serve as a guide to hydrological engineering planning, design, and management for policymakers and decision makers associated with hydrology.

  9. Scaling of Dielectric Breakdown Thresholds in Earth's and CO2-rich atmospheres: Impact for Predictions of Extraterrestrial Transient Luminous Events and Lightning Discharges

    Science.gov (United States)

    Riousset, J. A.

    2016-12-01

    Earth's atmospheric electricity manifests itself in the form of glow, corona, streamer, and leader discharges observed as Saint Elmo's fire, sprites, lightning and jet discharges, and other Transient Luminous Events (TLEs). All of these are types of dielectric breakdown, but are governed by different physics. In particular, their initiation is associated with the crossing of specific electric field thresholds: relativistic runaway, streamer propagation, conventional breakdown, or thermal runaway thresholds, some better understood than others. For example, the initiation of a lightning discharge is known to occur when the local electric field exceeds a value similar to the relativistic runaway field, but the exact threshold, as well as the physical mechanisms at work, remain rather unclear to date. Scaling laws for electric fields (and other quantities) have been established by Pasko et al. [GRL, 25(12), 2123-2126, 1998] and Pasko [NATO Sci. Series, Springer, 253-311, 2006]. In this work, we develop profiles for initiation criteria in air and in other atmospheric environments. We further calculate their associated scaling laws to determine the ability to trigger lightning flashes and TLEs in our solar system. This lets us predict the likelihood of electrical discharges on, e.g., Mars, Venus and Titan, and calculate the expected electric field conditions, under which discharges have been observed on Jupiter, Saturn, Uranus, and Neptune [Leblanc et al., ISSI Spa. Sci. Series, Springer, 2008; Yair, Adv. Space Res., 50(3), 293-310, 2012]. Our results anticipate the arrival of ExoMars 2016's Schiaparelli module, which will provide the first records of electric field at the surface of the planet [Déprez et al., EGU GA, 16, 16613, 2014]. This research is also motivated by the increasing probability of manned missions to Mars and the potential electrostatic hazards it may face [Yair, 2012], and by the role of electrical discharges in the creation of active radicals, some of

  10. Time series analysis of gold production in Malaysia

    Science.gov (United States)

    Muda, Nora; Hoon, Lee Yuen

    2012-05-01

    Gold is a soft, malleable, bright yellow metallic element and unaffected by air or most reagents. It is highly valued as an asset or investment commodity and is extensively used in jewellery, industrial application, dentistry and medical applications. In Malaysia, gold mining is limited in several areas such as Pahang, Kelantan, Terengganu, Johor and Sarawak. The main purpose of this case study is to obtain a suitable model for the production of gold in Malaysia. The model can also be used to predict the data of Malaysia's gold production in the future. Box-Jenkins time series method was used to perform time series analysis with the following steps: identification, estimation, diagnostic checking and forecasting. In addition, the accuracy of prediction is tested using mean absolute percentage error (MAPE). From the analysis, the ARIMA (3,1,1) model was found to be the best fitted model with MAPE equals to 3.704%, indicating the prediction is very accurate. Hence, this model can be used for forecasting. This study is expected to help the private and public sectors to understand the gold production scenario and later plan the gold mining activities in Malaysia.
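
    A Box-Jenkins fit of the same model order with statsmodels, on a made-up stand-in series (the gold production data are not reproduced here), with MAPE computed over a hold-out period, might look like this:

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

# Hypothetical monthly production series standing in for the gold data (kg)
rng = np.random.default_rng(5)
y = 4000 + np.cumsum(rng.normal(0, 60, 120)) + 300 * np.sin(np.arange(120) / 6.0)

train, test = y[:108], y[108:]
model = ARIMA(train, order=(3, 1, 1)).fit()      # Box-Jenkins ARIMA(3,1,1), as in the study
forecast = model.forecast(steps=len(test))

mape = 100 * np.mean(np.abs((test - forecast) / test))
print(f"MAPE over the hold-out year: {mape:.2f}%")
```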

  11. Demands, values, and burnout: relevance for physicians.

    Science.gov (United States)

    Leiter, Michael P; Frank, Erica; Matheson, Timothy J

    2009-12-01

    To explore the interaction between workload and values congruence (personal values with health care system values) in the context of burnout and physician engagement and to explore the relative importance of these factors by sex, given the distinct work patterns of male and female physicians. National mailed survey. Canada. A random sample of 8100 Canadian physicians (response rate 40%, N = 3213); 2536 responses (from physicians working more than 35 hours per week) were analyzed. Levels of burnout, values congruence, and workload, by sex, measured by the Maslach Burnout Inventory-General Scale and the Areas of Worklife Scale. Results showed a moderate level of burnout among Canadian physicians, with relatively positive scores on exhaustion, average scores on cynicism, and mildly negative scores on professional efficacy. A series of multiple regression analyses confirmed parallel main effect contributions from manageable workload and values congruence. Both workload and values congruence predicted exhaustion and cynicism for men and women (P = .001). Only values congruence provided a significant prediction of professional efficacy for both men and women (P = .001). These predictors interacted for women on all 3 aspects of burnout (exhaustion, cynicism, and diminished efficacy). However, overall levels of the burnout indicators departed only modestly from normative levels. Workload and values congruence make distinct contributions to physician burnout. Work overload contributes to predicting exhaustion and cynicism; professional values crises contribute to predicting exhaustion, cynicism, and low professional efficacy. The interaction of values and workload for women in particular has implications for the distinct work-life patterns of male and female physicians. Specifically, the congruence of individual values with values inherent in the health care system appeared to be of greater consequence for women than for men.

  12. Graphical Data Analysis on the Circle: Wrap-Around Time Series Plots for (Interrupted) Time Series Designs.

    Science.gov (United States)

    Rodgers, Joseph Lee; Beasley, William Howard; Schuelke, Matthew

    2014-01-01

    Many data structures, particularly time series data, are naturally seasonal, cyclical, or otherwise circular. Past graphical methods for time series have focused on linear plots. In this article, we move graphical analysis onto the circle. We focus on 2 particular methods, one old and one new. Rose diagrams are circular histograms and can be produced in several different forms using the RRose software system. In addition, we propose, develop, illustrate, and provide software support for a new circular graphical method, called Wrap-Around Time Series Plots (WATS Plots), which is a graphical method useful to support time series analyses in general but in particular in relation to interrupted time series designs. We illustrate the use of WATS Plots with an interrupted time series design evaluating the effect of the Oklahoma City bombing on birthrates in Oklahoma County during the 10 years surrounding the bombing of the Murrah Building in Oklahoma City. We compare WATS Plots with linear time series representations and overlay them with smoothing and error bands. Each method is shown to have advantages in relation to the other; in our example, the WATS Plots more clearly show the existence and effect size of the fertility differential.
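
    A bare-bones wrap-around (polar) plot of a monthly series can be produced with matplotlib. This sketch uses simulated monthly counts rather than the Oklahoma birthrate data, and it is not the RRose/WATS software itself:

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical monthly counts over 10 years, wrapped around the circle of the year
rng = np.random.default_rng(11)
months = np.arange(120)
counts = 800 + 60 * np.sin(2 * np.pi * months / 12) + rng.normal(0, 25, months.size)

theta = 2 * np.pi * (months % 12) / 12           # month of year -> angle
radius = counts

ax = plt.subplot(projection="polar")
ax.scatter(theta, radius, s=10, alpha=0.6)
ax.set_xticks(2 * np.pi * np.arange(12) / 12)
ax.set_xticklabels(["Jan", "Feb", "Mar", "Apr", "May", "Jun",
                    "Jul", "Aug", "Sep", "Oct", "Nov", "Dec"])
ax.set_title("Wrap-around view of a monthly series (illustrative)")
plt.show()
```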

  13. SUSTAINABLE ALLOY DESIGN: SEARCHING FOR RARE EARTH ELEMENT ALTERNATIVES THROUGH CRYSTAL ENGINEERING

    Science.gov (United States)

    2016-02-26

    Oni, X. Sang, S. V. Raju, S. Dumpala, S. Broderick, A. Kumar, S. Sinnott, S. Saxena, K. Rajan, and J. M. LeBeau, Applied Physics Letters 106... Discovery and Design, Eds. T. Lookman, F. Alexander and K. Rajan: Springer Series in Materials Science (2016) 2. S.R. Broderick and K. Rajan “Discovering... Cr-Co-Zn-Mn alloy” Intermetallics 68, 107-112 (2016) 5. O. Wodo, S. Broderick and K. Rajan, “Microstructural Informatics to Driven Rational

  14. Relevance between the degree of industrial competition and fair value information: Study on the listed companies in China

    OpenAIRE

    Xuemin Zhuang; Yonggen Luo

    2015-01-01

    Purpose: The purpose of this article is to study whether there exists a natural relationship between fair value and the corporate external market. A series of special phenomena in the application of fair value arouses our research interest and presents evidence on how competition affects the correlation of fair value information. Design/methodology/approach: this thesis chooses gains and losses from fair value changes and calculates the ratio DFVPSit as the alternative variable for fair value....

  15. A Revision of Clausius Work on the Second Law. 2. On the Values of Clausius Transformations

    Directory of Open Access Journals (Sweden)

    José C. Iñiguez

    1999-10-01

    Full Text Available Abstract: The values associated with Clausius transformations are obtained through an analysis different from that of Clausius, inasmuch as it does not introduce any assumption or condition as to what the combined value of the transformations in a reversible cyclical process should be. The values thus obtained allow for the identification of the flaw in Clausius' work leading to the inconsistency discussed in part I of this series.

  16. Visualizing the Geometric Series.

    Science.gov (United States)

    Bennett, Albert B., Jr.

    1989-01-01

    Mathematical proofs often leave students unconvinced or without understanding of what has been proved, because they provide no visual-geometric representation. Presented are geometric models for the finite geometric series when r is a whole number, and the infinite geometric series when r is the reciprocal of a whole number. (MNS)
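
    For reference, the identities that such geometric models visualize are the finite geometric sum and its infinite limit:

```latex
% Finite geometric series (r \neq 1) and its infinite limit for |r| < 1:
\[
  \sum_{k=0}^{n-1} a r^{k} \;=\; a\,\frac{1-r^{n}}{1-r},
  \qquad
  \sum_{k=0}^{\infty} a r^{k} \;=\; \frac{a}{1-r} \quad (|r|<1).
\]
```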

  17. Stochastic models for time series

    CERN Document Server

    Doukhan, Paul

    2018-01-01

    This book presents essential tools for modelling non-linear time series. The first part of the book describes the main standard tools of probability and statistics that directly apply to the time series context to obtain a wide range of modelling possibilities. Functional estimation and bootstrap are discussed, and stationarity is reviewed. The second part describes a number of tools from Gaussian chaos and proposes a tour of linear time series models. It goes on to address nonlinearity from polynomial or chaotic models for which explicit expansions are available, then turns to Markov and non-Markov linear models and discusses Bernoulli shifts time series models. Finally, the volume focuses on the limit theory, starting with the ergodic theorem, which is seen as the first step for statistics of time series. It defines the distributional range to obtain generic tools for limit theory under long or short-range dependences (LRD/SRD) and explains examples of LRD behaviours. More general techniques (central limit ...

  18. Assessment of values in university students

    Directory of Open Access Journals (Sweden)

    Francisco Manuel MORALES RODRÍGUEZ

    2014-03-01

    Full Text Available This paper reports the results of a questionnaire for assessing social values in university students (VASOL). Increasingly, society demands that its professionals must know how to cope with complexity, considering the human and social aspects of such situations. The European Higher Education Area (EHEA) has emphasized the interest in training future professionals as agents of social change, not only as regards the creation and management of new knowledge but also in the action of citizens who contribute to greater social cohesion. This research team has developed a new questionnaire to assess social justice and solidarity values. The questionnaire revealed a unifactorial configuration coherent with the theory. A sample of 945 university students completed the VASOL and these were subjected to a series of instruments aimed at evaluating the validity of the questionnaire. The VASOL proved to be a reliable and valid instrument. We discuss the usefulness of this new instrument for the screening of social justice and solidarity values, specifically for their detection, and for assessing social or interpersonal skills in the current model of the EHEA and the validation of psycho-educational programs.

  19. A Filtering of Incomplete GNSS Position Time Series with Probabilistic Principal Component Analysis

    Science.gov (United States)

    Gruszczynski, Maciej; Klos, Anna; Bogusz, Janusz

    2018-04-01

    For the first time, we introduce probabilistic principal component analysis (pPCA) for the spatio-temporal filtering of Global Navigation Satellite System (GNSS) position time series to estimate and remove Common Mode Error (CME) without the interpolation of missing values. We used data from the International GNSS Service (IGS) stations which contributed to the latest International Terrestrial Reference Frame (ITRF2014). The efficiency of the proposed algorithm was tested on simulated incomplete time series, and CME was then estimated for a set of 25 stations located in Central Europe. The newly applied pPCA was compared with previously used algorithms, which showed that this method is capable of resolving the problem of proper spatio-temporal filtering of GNSS time series characterized by different observation time spans. We showed that filtering can be carried out with the pPCA method even when two time series in the dataset have less than 100 common epochs of observations. The 1st Principal Component (PC) explained more than 36% of the total variance represented by the time series residuals (series with the deterministic model removed), which, compared to the variances of the other PCs (less than 8%), means that common signals are significant in the GNSS residuals. A clear improvement in the spectral indices of the power-law noise was noticed for the Up component, which is reflected by an average shift towards white noise from -0.98 to -0.67 (30%). We observed a significant average reduction in the accuracy of stations' velocities estimated from filtered residuals, by 35, 28 and 69% for the North, East, and Up components, respectively. The CME series were also analyzed in the context of the influence of environmental mass loading on the filtering results. Subtraction of the environmental loading models from the GNSS residuals leads to a reduction of the estimated CME variance by 20 and 65% for the horizontal and vertical components, respectively.
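
    The common-mode filtering idea can be sketched for the complete-data case with an ordinary SVD/PCA. The paper's contribution (pPCA) is precisely the extension to series with missing epochs, which this toy example does not handle:

```python
import numpy as np

def common_mode_filter(residuals):
    """Remove the first-PC common mode from a residual matrix (epochs x stations).

    Assumes a complete matrix; the pPCA approach in the study is what relaxes
    this assumption for series with gaps.
    """
    X = residuals - residuals.mean(axis=0)
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    cme = np.outer(U[:, 0] * s[0], Vt[0])        # rank-1 common-mode reconstruction
    return residuals - cme, cme

# Toy network: 25 stations sharing one regional signal plus station noise (made up)
rng = np.random.default_rng(2)
epochs, stations = 1000, 25
common = np.cumsum(rng.normal(0, 0.2, epochs))                    # shared signal
X = common[:, None] + rng.normal(0, 1.0, (epochs, stations))
filtered, cme = common_mode_filter(X)
print("variance before:", X.var(), "after filtering:", filtered.var())
```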

  20. Informing the Selection of Screening Hit Series with in Silico Absorption, Distribution, Metabolism, Excretion, and Toxicity Profiles.

    Science.gov (United States)

    Sanders, John M; Beshore, Douglas C; Culberson, J Christopher; Fells, James I; Imbriglio, Jason E; Gunaydin, Hakan; Haidle, Andrew M; Labroli, Marc; Mattioni, Brian E; Sciammetta, Nunzio; Shipe, William D; Sheridan, Robert P; Suen, Linda M; Verras, Andreas; Walji, Abbas; Joshi, Elizabeth M; Bueters, Tjerk

    2017-08-24

    High-throughput screening (HTS) has enabled millions of compounds to be assessed for biological activity, but challenges remain in the prioritization of hit series. While biological, absorption, distribution, metabolism, excretion, and toxicity (ADMET), purity, and structural data are routinely used to select chemical matter for further follow-up, the scarcity of historical ADMET data for screening hits limits our understanding of early hit compounds. Herein, we describe a process that utilizes a battery of in-house quantitative structure-activity relationship (QSAR) models to generate in silico ADMET profiles for hit series to enable more complete characterizations of HTS chemical matter. These profiles allow teams to quickly assess hit series for desirable ADMET properties or suspected liabilities that may require significant optimization. Accordingly, these in silico data can direct ADMET experimentation and profoundly impact the progression of hit series. Several prospective examples are presented to substantiate the value of this approach.

  1. Laplace boundary-value problem in paraboloidal coordinates

    International Nuclear Information System (INIS)

    Duggen, L; Willatzen, M; Voon, L C Lew Yan

    2012-01-01

    This paper illustrates both a problem in mathematical physics, whereby the method of separation of variables, while applicable, leads to three ordinary differential equations that remain fully coupled via two separation constants and a five-term recurrence relation for series solutions, and an exactly solvable problem in electrostatics, as a boundary-value problem on a paraboloidal surface. In spite of the complex nature of the former, it is shown that the latter solution can be quite simple. Results are provided for the equipotential surfaces and electric field lines are given near a paraboloidal conductor. (paper)

  2. Analysis of Heavy-Tailed Time Series

    DEFF Research Database (Denmark)

    Xie, Xiaolei

    This thesis is about analysis of heavy-tailed time series. We discuss tail properties of real-world equity return series and investigate the possibility that a single tail index is shared by all return series of actively traded equities in a market. Conditions for this hypothesis to be true...... are identified. We study the eigenvalues and eigenvectors of sample covariance and sample auto-covariance matrices of multivariate heavy-tailed time series, and particularly for time series with very high dimensions. Asymptotic approximations of the eigenvalues and eigenvectors of such matrices are found...... and expressed in terms of the parameters of the dependence structure, among others. Furthermore, we study an importance sampling method for estimating rare-event probabilities of multivariate heavy-tailed time series generated by matrix recursion. We show that the proposed algorithm is efficient in the sense...
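
    The record does not name a tail-index estimator, so the sketch below uses the Hill estimator, a standard choice for comparing tail indices across heavy-tailed return series; the data are synthetic and the estimator is shown only as an assumed, illustrative tool.

      # Hedged sketch: Hill estimator of the tail index of a heavy-tailed sample.
      import numpy as np

      def hill_estimator(x, k):
          """Tail-index estimate from the k largest order statistics of |x|."""
          x = np.sort(np.abs(np.asarray(x, dtype=float)))[::-1]   # descending order
          logs = np.log(x[:k]) - np.log(x[k])
          return 1.0 / logs.mean()

      rng = np.random.default_rng(1)
      returns = rng.pareto(3.0, size=10_000) + 1.0   # Pareto sample with tail index 3
      print(hill_estimator(returns, k=500))          # should be roughly 3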

  3. The Value of Children: A Cross-National Study, Volume Three. Hawaii.

    Science.gov (United States)

    Arnold, Fred; Fawcett, James T.

    The document, one in a series of seven reports from the Value of Children Project, discusses results of the survey in Hawaii. Specifically, the study investigated the social, psychological, and economic costs and benefits associated with having children. The volume is presented in seven chapters. Chapter I describes the background of the study and…

  4. Kapteyn series arising in radiation problems

    International Nuclear Information System (INIS)

    Lerche, I; Tautz, R C

    2008-01-01

    In discussing radiation from multiple point charges or magnetic dipoles, moving in circles or ellipses, a variety of Kapteyn series of the second kind arises. Some of the series have been known in closed form for a hundred years or more, others appear not to be available to analytic persuasion. This paper shows how 12 such generic series can be developed to produce either closed analytic expressions or integrals that are not analytically tractable. In addition, the method presented here may be of benefit when one has other Kapteyn series of the second kind to consider, thereby providing an additional reason to consider such series anew

  5. Uranium series isotopes concentration in sediments at San Marcos and Luis L. Leon reservoirs, Chihuahua, Mexico

    Science.gov (United States)

    Méndez-García, C.; Renteria-Villalobos, M.; García-Tenorio, R.; Montero-Cabrera, M. E.

    2014-07-01

    Spatial and temporal distribution of the radioisotopes concentrations were determined in sediments near the surface and core samples extracted from two reservoirs located in an arid region close to Chihuahua City, Mexico. At San Marcos reservoir one core was studied, while from Luis L. Leon reservoir one core from the entrance and another one close to the wall were investigated. 232Th-series, 238U-series, 40K and 137Cs activity concentrations (AC, Bq kg-1) were determined by gamma spectrometry with a high purity Ge detector. 238U and 234U ACs were obtained by liquid scintillation and alpha spectrometry with a surface barrier detector. Dating of core sediments was performed applying CRS method to 210Pb activities. Results were verified by 137Cs AC. Resulting activity concentrations were compared among corresponding surface and core sediments. High 238U-series AC values were found in sediments from San Marcos reservoir, because this site is located close to the Victorino uranium deposit. Low AC values found in Luis L. Leon reservoir suggest that the uranium present in the source of the Sacramento - Chuviscar Rivers is not transported up to the Conchos River. Activity ratios (AR) 234U/238U and 238U/226Ra in sediments have values between 0.9-1.2, showing a behavior close to radioactive equilibrium in the entire basin. 232Th/238U, 228Ra/226Ra ARs are witnesses of the different geological origin of sediments from San Marcos and Luis L. Leon reservoirs.

  6. Uranium series isotopes concentration in sediments at San Marcos and Luis L. Leon reservoirs, Chihuahua, Mexico

    International Nuclear Information System (INIS)

    Méndez-García, C.; Montero-Cabrera, M. E.; Renteria-Villalobos, M.; García-Tenorio, R.

    2014-01-01

    Spatial and temporal distribution of the radioisotopes concentrations were determined in sediments near the surface and core samples extracted from two reservoirs located in an arid region close to Chihuahua City, Mexico. At San Marcos reservoir one core was studied, while from Luis L. Leon reservoir one core from the entrance and another one close to the wall were investigated. 232Th-series, 238U-series, 40K and 137Cs activity concentrations (AC, Bq kg−1) were determined by gamma spectrometry with a high purity Ge detector. 238U and 234U ACs were obtained by liquid scintillation and alpha spectrometry with a surface barrier detector. Dating of core sediments was performed applying CRS method to 210Pb activities. Results were verified by 137Cs AC. Resulting activity concentrations were compared among corresponding surface and core sediments. High 238U-series AC values were found in sediments from San Marcos reservoir, because this site is located close to the Victorino uranium deposit. Low AC values found in Luis L. Leon reservoir suggest that the uranium present in the source of the Sacramento – Chuviscar Rivers is not transported up to the Conchos River. Activity ratios (AR) 234U/238U and 238U/226Ra in sediments have values between 0.9–1.2, showing a behavior close to radioactive equilibrium in the entire basin. 232Th/238U, 228Ra/226Ra ARs are witnesses of the different geological origin of sediments from San Marcos and Luis L. Leon reservoirs

  7. International Christian Schoolteachers' Traits, Characteristics, and Qualities Valued by Third Culture Kids

    Science.gov (United States)

    Linton, Dale B.

    2015-01-01

    In this qualitative grounded theory study, 24 participants, referred to as "third culture kids" (or TCKs), ages 18-30 years, who had previously attended international Christian schools were interviewed to determine the dispositions they valued in their teachers. Incorporating principles of grounded theory, a series of rigorous steps were…

  8. Chart Series

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Centers for Medicare and Medicaid Services (CMS) offers several different Chart Series with data on beneficiary health status, spending, operations, and quality...

  9. Women's series: by women, for women?

    NARCIS (Netherlands)

    Kuitert, L.; Spiers, J.

    2011-01-01

    One of the striking phenomena in 19th-century publishing history is the abundant publication of publishers' series. This contribution concerns series specifically meant for women. The focus is on Dutch literary series for women, mostly from the 19th century.

  10. Time Series with Long Memory

    OpenAIRE

    西埜, 晴久

    2004-01-01

    The paper investigates the application of long-memory processes to economic time series. We show properties of long-memory processes that motivate their use for modelling long-memory phenomena in economic time series. An FARIMA model is described as an example of a long-memory model in statistical terms. The paper explains basic limit theorems and estimation methods for long-memory processes in order to apply long-memory models to economic time series.
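
    Since the FARIMA model is defined through the fractional difference operator (1 - B)^d, a minimal sketch of that operator may help. The recurrence for the weights follows the binomial expansion of (1 - B)^d; the series and the choice d = 0.3 are illustrative only.

      # Hedged sketch: fractional-differencing weights of a FARIMA(0, d, 0) filter.
      import numpy as np

      def frac_diff_weights(d, n):
          """Coefficients of (1 - B)^d: pi_0 = 1, pi_k = -pi_{k-1} * (d - k + 1) / k."""
          w = [1.0]
          for k in range(1, n):
              w.append(-w[-1] * (d - k + 1) / k)
          return np.array(w)

      def frac_diff(x, d):
          """Apply the fractional difference filter (1 - B)^d to the series x."""
          w = frac_diff_weights(d, len(x))
          return np.array([np.dot(w[:t + 1], x[t::-1]) for t in range(len(x))])

      rng = np.random.default_rng(2)
      x = np.cumsum(rng.normal(size=500))      # a persistent-looking synthetic series
      print(frac_diff(x, d=0.3)[:5])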

  11. Time Series Momentum

    DEFF Research Database (Denmark)

    Moskowitz, Tobias J.; Ooi, Yao Hua; Heje Pedersen, Lasse

    2012-01-01

    We document significant “time series momentum” in equity index, currency, commodity, and bond futures for each of the 58 liquid instruments we consider. We find persistence in returns for one to 12 months that partially reverses over longer horizons, consistent with sentiment theories of initial...... under-reaction and delayed over-reaction. A diversified portfolio of time series momentum strategies across all asset classes delivers substantial abnormal returns with little exposure to standard asset pricing factors and performs best during extreme markets. Examining the trading activities...

  12. Fuzzy time series forecasting model with natural partitioning length approach for predicting the unemployment rate under different degree of confidence

    Science.gov (United States)

    Ramli, Nazirah; Mutalib, Siti Musleha Ab; Mohamad, Daud

    2017-08-01

    Fuzzy time series forecasting models have been proposed since 1993 to cater for data given as linguistic values. Many improvements and modifications have been made to the model, such as enhancements to the length of intervals and the types of fuzzy logical relations. However, most of the improved models represent the linguistic terms in the form of discrete fuzzy sets. In this paper, a fuzzy time series model with data in the form of trapezoidal fuzzy numbers and a natural partitioning length approach is introduced for predicting the unemployment rate. Two types of fuzzy relations are used in this study, namely first-order and second-order fuzzy relations. The proposed model can produce the forecasted values under different degrees of confidence.
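
    A minimal sketch of the general fuzzy time series forecasting scheme may be useful. It follows the common Chen-style first-order procedure with equal-length intervals (partition, fuzzify, build fuzzy logical relationship groups, defuzzify); the paper's trapezoidal fuzzy numbers, natural partitioning length and second-order relations are not reproduced, and the data are made up.

      # Hedged sketch: Chen-style first-order fuzzy time series forecast.
      import numpy as np
      from collections import defaultdict

      def fuzzy_forecast(series, n_intervals=7):
          lo, hi = min(series), max(series)
          edges = np.linspace(lo, hi, n_intervals + 1)
          mids = (edges[:-1] + edges[1:]) / 2
          # Fuzzify: assign every observation to the interval (fuzzy set) containing it
          labels = np.clip(np.searchsorted(edges, series, side="right") - 1,
                           0, n_intervals - 1)
          # First-order fuzzy logical relationship groups: A_i -> {A_j, ...}
          groups = defaultdict(set)
          for a, b in zip(labels[:-1], labels[1:]):
              groups[a].add(b)
          # Defuzzify: forecast as the mean midpoint of the sets reachable from the last state
          successors = groups.get(labels[-1], {labels[-1]})
          return float(np.mean([mids[j] for j in successors]))

      unemployment = [3.1, 3.3, 3.0, 3.4, 3.6, 3.5, 3.2, 3.4]   # illustrative rates (%)
      print(fuzzy_forecast(unemployment))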

  13. Fourier Series Optimization Opportunity

    Science.gov (United States)

    Winkel, Brian

    2008-01-01

    This note discusses the introduction of Fourier series as an immediate application of optimization of a function of more than one variable. Specifically, it is shown how the study of Fourier series can be motivated to enrich a multivariable calculus class. This is done through discovery learning and use of technology wherein students build the…
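
    The optimization viewpoint mentioned above is usually presented along the following lines (a sketch of the standard argument, not necessarily the note's exact development): minimize the mean-square error of a trigonometric polynomial over its coefficients,

      E(a_0,\dots,a_N,b_1,\dots,b_N) = \int_{-\pi}^{\pi} \Big( f(x) - \tfrac{a_0}{2} - \sum_{n=1}^{N} \big( a_n \cos nx + b_n \sin nx \big) \Big)^2 \, dx .

    Setting \partial E/\partial a_n = 0 and \partial E/\partial b_n = 0 and using the orthogonality of the trigonometric system gives

      a_n = \frac{1}{\pi} \int_{-\pi}^{\pi} f(x) \cos nx \, dx , \qquad b_n = \frac{1}{\pi} \int_{-\pi}^{\pi} f(x) \sin nx \, dx ,

    i.e. the Fourier coefficients; the critical point is a minimum because E is a convex quadratic in the coefficients.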

  14. Extreme values of meteorological parameters observed at Kalpakkam during the period 1968-1999

    International Nuclear Information System (INIS)

    Balagurunathan, M.R.; Chandresekharan, E.; Rajan, M.P.; Gurg, R.P.

    2001-05-01

    In the design phase of engineering structures, an understanding of the extreme weather conditions that may occur at the site of interest is essential, so that the structures can be designed to withstand climatological stresses during their lifetime. This report describes an analysis of the extreme values of meteorological parameters at Kalpakkam for the period 1968-99, which provides an insight into such situations. The extreme value analysis reveals that all the variables obey the Fisher-Tippett Type-I extreme value distribution function. Parameter values of the fitted extreme value distribution functions are presented for the variables studied, and the 50- and 100-year return period extreme values are derived. The frequency distribution of rainfall parameters is investigated. The time series of annual rainfall data suggests a cycle with a period of 2-3 years. (author)
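
    A hedged sketch of the fitting step described above: a Fisher-Tippett Type-I (Gumbel) distribution is fitted to a set of annual maxima and the 50- and 100-year return values are read off as quantiles. scipy is assumed to be available, and the sample is synthetic rather than the Kalpakkam data.

      # Hedged sketch: Gumbel (Fisher-Tippett Type-I) fit and return-period values.
      from scipy import stats

      # Illustrative annual maxima (e.g. daily-maximum temperature, one value per year)
      annual_max = stats.gumbel_r.rvs(loc=40.0, scale=1.5, size=32, random_state=3)

      loc, scale = stats.gumbel_r.fit(annual_max)
      for T in (50, 100):
          # T-year return value = quantile at non-exceedance probability 1 - 1/T
          x_T = stats.gumbel_r.ppf(1.0 - 1.0 / T, loc=loc, scale=scale)
          print("%d-year return value: %.1f" % (T, x_T))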

  15. A-integrable martingale sequences and Walsh series

    International Nuclear Information System (INIS)

    Skvortsov, V A

    2001-01-01

    A sufficient condition for a Walsh series converging to an A-integrable function f to be the A-Fourier series of f is stated in terms of the uniform A-integrability of a martingale subsequence of the partial sums of the Walsh series. Moreover, the existence is proved of a Walsh series that converges almost everywhere to an A-integrable function but is not the A-Fourier series of its sum.

  16. Wet tropospheric delays forecast based on Vienna Mapping Function time series analysis

    Science.gov (United States)

    Rzepecka, Zofia; Kalita, Jakub

    2016-04-01

    It is well known that the dry part of the zenith tropospheric delay (ZTD) is much easier to model than the wet part (ZTW). The aim of this research is the stochastic modelling and prediction of ZTW using time series analysis tools. Application of time series analysis enables a closer understanding of ZTW behaviour as well as short-term prediction of future ZTW values. The ZTW data used for the study were obtained from the GGOS service held by the Vienna University of Technology. The resolution of the data is six hours, and ZTW values for the years 2010-2013 were adopted for the study. The International GNSS Service (IGS) permanent stations LAMA and GOPE, located in mid-latitudes, were used in the investigations. Initially the seasonal part was separated and modelled using periodic signals and frequency analysis; the prominent annual and semi-annual signals were removed using sine and cosine functions. The autocorrelation of the resulting signal is significant for several days (20-30 samples). The residuals of this fitting were further analysed and modelled with ARIMA processes. For both stations, optimal ARMA processes were obtained based on several criteria. On this basis, predicted ZTW values were computed one day ahead, leaving white-noise residuals. The accuracy of the prediction can be estimated at about 3 cm.
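
    A hedged sketch of the processing chain described above: the annual and semi-annual harmonics are removed from a 6-hourly ZTW series by least squares, and the residual is modelled with an ARMA process and forecast one day ahead. The synthetic ZTW series, the ARMA order and the use of statsmodels are illustrative assumptions, not the paper's exact settings.

      # Hedged sketch: harmonic removal + ARMA forecast of a 6-hourly ZTW series.
      import numpy as np
      from statsmodels.tsa.arima.model import ARIMA

      samples_per_year = 4 * 365.25                     # 6-hourly sampling
      t = np.arange(4 * 1461)                           # four years of epochs
      rng = np.random.default_rng(4)
      ztw = (150.0
             + 40.0 * np.sin(2 * np.pi * t / samples_per_year)      # annual term
             + 10.0 * np.sin(4 * np.pi * t / samples_per_year)      # semi-annual term
             + 0.05 * np.cumsum(rng.normal(0.0, 0.5, t.size)))      # correlated noise, mm

      # Least-squares fit of the seasonal part: offset + annual + semi-annual harmonics
      w = 2 * np.pi / samples_per_year
      X = np.column_stack([np.ones(t.size), np.sin(w * t), np.cos(w * t),
                           np.sin(2 * w * t), np.cos(2 * w * t)])
      coef, *_ = np.linalg.lstsq(X, ztw, rcond=None)
      residual = ztw - X @ coef

      # Model the stochastic part with an ARMA process (order chosen only for illustration)
      model = ARIMA(residual, order=(2, 0, 1)).fit()
      print(model.forecast(steps=4))                    # one day ahead at 6 h resolution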

  17. Network structure of multivariate time series.

    Science.gov (United States)

    Lacasa, Lucas; Nicosia, Vincenzo; Latora, Vito

    2015-10-21

    Our understanding of a variety of phenomena in physics, biology and economics crucially depends on the analysis of multivariate time series. While a wide range of tools and techniques for time series analysis already exist, the increasing availability of massive data structures calls for new approaches for multidimensional signal processing. We present here a non-parametric method to analyse multivariate time series, based on the mapping of a multidimensional time series into a multilayer network, which allows information on a high-dimensional dynamical system to be extracted through the analysis of the structure of the associated multiplex network. The method is simple to implement, general, scalable, does not require ad hoc phase space partitioning, and is thus suitable for the analysis of large, heterogeneous and non-stationary time series. We show that simple structural descriptors of the associated multiplex networks allow nontrivial properties of coupled chaotic maps to be extracted and quantified, including the transition between different dynamical phases and the onset of various types of synchronization. As a concrete example we then study financial time series, showing that a multiplex network analysis can efficiently discriminate crises from periods of financial stability, where standard methods based on time-series symbolization often fail.
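
    The abstract does not spell out the series-to-network mapping, so the sketch below assumes the horizontal visibility graph (HVG), a common choice in this line of work: each component of the multivariate series becomes one layer of the multiplex, with all layers sharing the time indices as nodes.

      # Hedged sketch: one HVG layer per variable of a multivariate series.
      import numpy as np

      def horizontal_visibility_edges(x):
          """Edges (i, j), i < j, with x[k] < min(x[i], x[j]) for every i < k < j."""
          edges = []
          n = len(x)
          for i in range(n - 1):
              edges.append((i, i + 1))                  # neighbours always see each other
              blocker = x[i + 1]
              for j in range(i + 2, n):
                  if min(x[i], x[j]) > blocker:
                      edges.append((i, j))
                  blocker = max(blocker, x[j])
                  if blocker >= x[i]:
                      break                             # nothing further can see node i
          return edges

      rng = np.random.default_rng(5)
      series = rng.normal(size=(3, 200))                # 3 variables, 200 time steps
      layers = [horizontal_visibility_edges(s) for s in series]
      for m, edges in enumerate(layers):
          degree = np.zeros(series.shape[1])
          for i, j in edges:
              degree[i] += 1
              degree[j] += 1
          # For i.i.d. noise the mean HVG degree tends to 4, a useful sanity check
          print("layer %d: mean degree %.2f" % (m, degree.mean()))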

  18. Possible chaoticity for the time series of the amount of nuclear information released by the newsmedia

    International Nuclear Information System (INIS)

    Ohnishi, T.

    1995-01-01

    A time series analysis was performed on the amount of information concerning nuclear problems released by three types of news media in Japan (the press, television and magazines) during the past 20 years. The time series of the logarithmic value of the amount released by some of the news media was found to be possibly chaotic, or at least non-stochastic. Such a characteristic of the time series can be interpreted as the result of a certain kind of selection process operating within the news media when deciding whether an event is to be released as news. (author)

  19. Analysis of frequency-dependent series resistance and interface states of In/SiO2/p-Si (MIS) structures

    Energy Technology Data Exchange (ETDEWEB)

    Birkan Selcuk, A. [Department of Nuclear Electronics and Instrumentation, Saraykoey Nuclear Research and Training Center, 06983 Saray, Ankara (Turkey); Tugluoglu, N. [Department of Nuclear Electronics and Instrumentation, Saraykoey Nuclear Research and Training Center, 06983 Saray, Ankara (Turkey)], E-mail: ntuglu@taek.gov.tr; Karadeniz, S.; Bilge Ocak, S. [Department of Nuclear Electronics and Instrumentation, Saraykoey Nuclear Research and Training Center, 06983 Saray, Ankara (Turkey)

    2007-11-15

    In this work, the investigation of the interface state density and series resistance from capacitance-voltage (C-V) and conductance-voltage (G/ω-V) characteristics in In/SiO2/p-Si metal-insulator-semiconductor (MIS) structures with a thin interfacial insulator layer has been reported. The thickness of the SiO2 film, obtained from the measurement of the oxide capacitance corrected for series resistance in the strong accumulation region, is 220 A. The forward and reverse bias C-V and G/ω-V characteristics of the MIS structures have been studied in the frequency range 30 kHz-1 MHz at room temperature. The frequency dispersion in capacitance and conductance can be interpreted in terms of the series resistance (Rs) and interface state density (Dit) values. Both the series resistance Rs and the density of interface states Dit are strongly frequency-dependent and decrease with increasing frequency. The distribution profile of Rs-V gives a peak at low frequencies in the depletion region that disappears with increasing frequency. Experimental results show that the interfacial polarization contributes to the improvement of the dielectric properties of In/SiO2/p-Si MIS structures. The interface state density value of the In/SiO2/p-Si MIS diode calculated in the strong accumulation region is 1.11x10^12 eV^-1 cm^-2 at 1 MHz. It is found that the calculated value of Dit (~10^12 eV^-1 cm^-2) is not high enough to pin the Fermi level of the Si substrate and disrupt the device operation.
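
    The abstract does not reproduce its formulas; the sketch below therefore only shows the conventional conductance-method relation commonly used in such studies, Rs = Gm / (Gm^2 + (wCm)^2), evaluated from the measured strong-accumulation admittance with made-up numbers. It should not be read as the paper's exact procedure.

      # Hedged sketch: series resistance from accumulation-region admittance (standard relation).
      import numpy as np

      def series_resistance(freq_hz, c_meas, g_meas):
          """Rs from measured accumulation capacitance (F) and conductance (S)."""
          w = 2 * np.pi * freq_hz
          return g_meas / (g_meas**2 + (w * c_meas)**2)

      # Illustrative (made-up) strong-accumulation values at 1 MHz
      print("Rs = %.1f ohm" % series_resistance(1.0e6, 4.5e-10, 2.0e-4))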

  20. Preparation of two series of materials with perovskite structure and investigation of their physical properties

    International Nuclear Information System (INIS)

    Mohamed, H.S.R.

    2010-01-01

    Results on structural, electric transport and magnetic properties of a series of (Al / In) doped Ca-series and (Al / In) doped Sr-series are presented and discussed. The polycrystalline ceramic samples were prepared by the solid state reaction technique. Elemental analysis showed a reasonable agreement between nominal and actual sample compositions. The grain size (G.S) of the Ca doped series increased with In content (G.S. (x = 0.2) = 79.5 nm and G.S. (x = 0.8) = 95.4 nm). For the Sr-series it has values in the range of 40 - 42 nm. Room temperature structural analysis using the Rietveld refinement technique showed no structural transitions with the variation of the Al / In ratio. The doped Ca-series had an orthorhombic symmetry with space group Pnma. The Sr-doped series is rhombohedral with space group R-3c. In both series the Mn-O bond distance was found to increase whereas the mean Mn-O-Mn bond angle decreased with x. This was ascribed to the size mismatch between the divalent A-site ions and the B-site as a result of the introduction of the large In3+ ion size. The tolerance factor varies from 0.918-0.933 for the Ca-series and from 0.932 - 0.948 for the Sr-series as x varies from 0.0 to 1.0. The temperature dependence of the magnetic susceptibility and electric resistivity of the Ca-doped series showed distinct ferromagnetic metallic (FMM) to paramagnetic insulator (PMI) transitions near the Curie point (TC), which ranges from TC ∼ 210 - 100 K for x = 0.0 to 1.0 respectively. The temperature dependence of the resistivity for the Sr-doped series showed distinct FMM to PMI transitions for samples with x = 0.0, 0.2 and 1.0, whereas samples with x = 0.4, 0.6 and 0.8 showed FMM to PMM. The transition temperature variation is not linear and lies within a narrow temperature range Tp ∼ 344 - 367 K. The results of the Sr-series showed that the size mismatch between the A- and B-sites is the major factor that controls the magnetic and electric properties.
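
    For reference, the tolerance factor quoted above is normally the Goldschmidt tolerance factor computed from ionic radii. A hedged sketch with illustrative (not the thesis's) radii is given below, using weighted mean radii to mimic mixed site occupancy.

      # Hedged sketch: Goldschmidt tolerance factor for an ABO3 perovskite.
      import math

      def tolerance_factor(r_a, r_b, r_o=1.40):
          """t = (rA + rO) / (sqrt(2) * (rB + rO)), radii in angstrom."""
          return (r_a + r_o) / (math.sqrt(2) * (r_b + r_o))

      # Illustrative Shannon-type radii; weighted means mimic mixed site occupancy
      r_a_mean = 0.7 * 1.34 + 0.3 * 1.18        # hypothetical mixed A site
      r_b_mean = 0.5 * 0.645 + 0.5 * 0.62       # hypothetical mixed Mn3+/Mn4+ B site
      print(round(tolerance_factor(r_a_mean, r_b_mean), 3))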

  1. Improving the Instruction of Infinite Series

    Science.gov (United States)

    Lindaman, Brian; Gay, A. Susan

    2012-01-01

    Calculus instructors struggle to teach infinite series, and students have difficulty understanding series and related concepts. Four instructional strategies, prominently used during the calculus reform movement, were implemented during a 3-week unit on infinite series in one class of second-semester calculus students. A description of each…

  2. Authenticating ‘Cover to Cover’ Reader Series vis-à-vis Cultural Norms for the Iranian community

    Directory of Open Access Journals (Sweden)

    Marjan Vosougi

    2015-11-01

    Full Text Available This research study was an attempt to explore the hidden cultural components in an ELT textbook from Oxford University Press (OUP) titled 'Cover to Cover'. Two research methodologies were relied on to unveil the western ideologies in this series. Firstly, a qualitative review of its reading texts was undertaken to authenticate the hidden western values for Iranian contexts. At this stage, randomly chosen units of the three-volume CTC book were analysed via qualitative content analysis using inductive category formation techniques. In a second stage, a focus group of English language teachers engaged in teaching this series was interviewed to enrich the data with lived experiences. Overall, the findings revealed that the hidden values in the sampled texts might transmit some counter-local perspectives conflicting with Iranian learners' local culture. Pedagogical suggestions for improving critical cultural awareness practices for non-native students, in the light of material development practices for EFL settings, are discussed at the end.

  3. A Review of Subsequence Time Series Clustering

    Directory of Open Access Journals (Sweden)

    Seyedjamal Zolhavarieh

    2014-01-01

    Full Text Available Clustering of subsequence time series remains an open issue in time series clustering. Subsequence time series clustering is used in different fields, such as e-commerce, outlier detection, speech recognition, biological systems, DNA recognition, and text mining. One of the useful fields in the domain of subsequence time series clustering is pattern recognition. To improve this field, a sequence of time series data is used. This paper reviews some definitions and backgrounds related to subsequence time series clustering. The categorization of the literature reviews is divided into three groups: preproof, interproof, and postproof period. Moreover, various state-of-the-art approaches in performing subsequence time series clustering are discussed under each of the following categories. The strengths and weaknesses of the employed methods are evaluated as potential issues for future studies.

  4. A review of subsequence time series clustering.

    Science.gov (United States)

    Zolhavarieh, Seyedjamal; Aghabozorgi, Saeed; Teh, Ying Wah

    2014-01-01

    Clustering of subsequence time series remains an open issue in time series clustering. Subsequence time series clustering is used in different fields, such as e-commerce, outlier detection, speech recognition, biological systems, DNA recognition, and text mining. One of the useful fields in the domain of subsequence time series clustering is pattern recognition. To improve this field, a sequence of time series data is used. This paper reviews some definitions and backgrounds related to subsequence time series clustering. The categorization of the literature reviews is divided into three groups: preproof, interproof, and postproof period. Moreover, various state-of-the-art approaches in performing subsequence time series clustering are discussed under each of the following categories. The strengths and weaknesses of the employed methods are evaluated as potential issues for future studies.

  5. A Review of Subsequence Time Series Clustering

    Science.gov (United States)

    Teh, Ying Wah

    2014-01-01

    Clustering of subsequence time series remains an open issue in time series clustering. Subsequence time series clustering is used in different fields, such as e-commerce, outlier detection, speech recognition, biological systems, DNA recognition, and text mining. One of the useful fields in the domain of subsequence time series clustering is pattern recognition. To improve this field, a sequence of time series data is used. This paper reviews some definitions and backgrounds related to subsequence time series clustering. The categorization of the literature reviews is divided into three groups: preproof, interproof, and postproof period. Moreover, various state-of-the-art approaches in performing subsequence time series clustering are discussed under each of the following categories. The strengths and weaknesses of the employed methods are evaluated as potential issues for future studies. PMID:25140332

  6. Determining the Points of Change in Time Series of Polarimetric SAR Data

    DEFF Research Database (Denmark)

    Conradsen, Knut; Nielsen, Allan Aasbjerg; Skriver, Henning

    2016-01-01

    We present the likelihood ratio test statistic for the homogeneity of several complex variance–covariance matrices that may be used in order to assess whether at least one change has taken place in a time series of SAR data. Furthermore, we give a factorization of this test statistic into a product ... The pixelwise analyses are applied on homogeneous subareas covered with different vegetation types using the distribution of the observed p-values....

  7. Detection of land cover change using an Artificial Neural Network on a time-series of MODIS satellite data

    CSIR Research Space (South Africa)

    Olivier, JC

    2007-11-01

    Full Text Available An Artificial Neural Network (ANN) is proposed to detect human-induced land cover change using a sliding window through a time-series of Moderate Resolution Imaging Spectroradiometer (MODIS) satellite surface reflectance pixel values. Training...

  8. Preparing Landsat Image Time Series (LITS) for Monitoring Changes in Vegetation Phenology in Queensland, Australia

    Directory of Open Access Journals (Sweden)

    Santosh Bhandari

    2012-06-01

    Full Text Available Time series of images are required to extract and separate information on vegetation change due to phenological cycles, inter-annual climatic variability, and long-term trends. While images from the Landsat Thematic Mapper (TM) sensor have the spatial and spectral characteristics suited for mapping a range of vegetation structural and compositional properties, its 16-day revisit period, combined with cloud cover problems and a seasonally limited latitudinal range, limits the availability of images at intervals and durations suitable for time series analysis of vegetation in many parts of the world. A Landsat Image Time Series (LITS) is defined here as a sequence of Landsat TM images with observations every 16 days for a five-year period, commencing in July 2003, for a Eucalyptus woodland area in Queensland, Australia. Synthetic Landsat TM images were created using the Spatial and Temporal Adaptive Reflectance Fusion Model (STARFM) algorithm for all dates when images were either unavailable or too cloudy. This was done using cloud-free scenes and a MODIS Nadir BRDF Adjusted Reflectance (NBAR) product. The ability of the LITS to measure attributes of vegetation phenology was examined by: (1) assessing the accuracy of predicted image-derived Foliage Projective Cover (FPC) estimates using ground-measured values; and (2) comparing the LITS-generated normalized difference vegetation index (NDVI) and MODIS NDVI (MOD13Q1) time series. The predicted image-derived FPC products (values range from 0 to 100%) had an RMSE of 5.6. Comparison between vegetation phenology parameters estimated from LITS-generated NDVI and MODIS NDVI showed no significant difference in trend and less than 16 days (equal to the composite period of the MODIS data used) difference in key seasonal parameters, including start and end of season in most of the cases. In comparison to similar published work, this paper tested the STARFM algorithm in a new (broadleaf forest) environment and also

  9. A regional and nonstationary model for partial duration series of extreme rainfall

    DEFF Research Database (Denmark)

    Gregersen, Ida Bülow; Madsen, Henrik; Rosbjerg, Dan

    2017-01-01

    as the explanatory variables in the regional and temporal domain, respectively. Further analysis of partial duration series with nonstationary and regional thresholds shows that the mean exceedances also exhibit a significant variation in space and time for some rainfall durations, while the shape parameter is found...... of extreme rainfall. The framework is built on a partial duration series approach with a nonstationary, regional threshold value. The model is based on generalized linear regression solved by generalized estimation equations. It allows a spatial correlation between the stations in the network and accounts...... furthermore for variable observation periods at each station and in each year. Marginal regional and temporal regression models solved by generalized least squares are used to validate and discuss the results of the full spatiotemporal model. The model is applied on data from a large Danish rain gauge network...

  10. Harmonic Analysis of a Nonstationary Series of Temperature Paleoreconstruction for the Central Part of Greenland

    Directory of Open Access Journals (Sweden)

    T.E. Danova

    2016-06-01

    Full Text Available The results of investigations of a transformed series of reconstructed air temperature data for the central part of Greenland, with an increment of 30 years, are presented. Stationarization of a ~50,000-year series of reconstructed air temperature in the central part of Greenland, based on ice core data, has been performed using the mathematical expectation. To estimate the mathematical expectation, smoothing by moving averages and by wavelet analysis was carried out. The Fourier transform was applied repeatedly to the stationarized series while changing the averaging time used in the smoothing. Three averaging times were selected for the investigations: ~400–500 years, ~2,000 years, and ~4,000 years. Stationarization of the reconstructed temperature series with the help of the wavelet transform showed the best results for averaging times of ~400 and ~2,000 years: the trends characterize the initial temperature series well, thereby revealing the main patterns of its dynamics. Using an averaging time of ~4,000 years gave the worst result: significant events of the main temperature series were lost in the averaging. The results correspond well to the cyclicity known to be inherent in the climate system of the planet; the detected modes of 1,470 ± 500 years are comparable to the Dansgaard–Oeschger and Bond oscillations.
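
    A hedged sketch of the two processing steps described above, on synthetic data: the series is stationarized by subtracting a moving-average estimate of the mathematical expectation, and an FFT spectrum of the residual is then inspected for dominant periodicities. The window length, the embedded ~1,470-year cycle and all numbers are illustrative assumptions only.

      # Hedged sketch: moving-average stationarization followed by spectral analysis.
      import numpy as np

      def moving_average(x, window):
          return np.convolve(x, np.ones(window) / window, mode="same")

      rng = np.random.default_rng(6)
      n = 1600                                            # ~48,000 years at 30-year steps
      t = np.arange(n)
      temp = (-30.0 + 3.0 * np.sin(2 * np.pi * t / 49)    # embedded ~1,470-year cycle
              + 0.002 * t + rng.normal(0, 0.8, n))

      trend = moving_average(temp, window=67)             # ~2,000-year smoothing window
      residual = temp - trend                             # "stationarized" series

      spectrum = np.abs(np.fft.rfft(residual - residual.mean()))**2
      freqs = np.fft.rfftfreq(n, d=1.0)                   # cycles per 30-year sample
      peak = freqs[1:][np.argmax(spectrum[1:])]
      print("dominant period: %.0f samples (~%.0f years)" % (1 / peak, 30 / peak))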

  11. Using NASA's Giovanni System to Simulate Time-Series Stations in the Outflow Region of California's Eel River

    Science.gov (United States)

    Acker, James G.; Shen, Suhung; Leptoukh, Gregory G.; Lee, Zhongping

    2012-01-01

    Oceanographic time-series stations provide vital data for the monitoring of oceanic processes, particularly those associated with trends over time and interannual variability. There are likely numerous locations where the establishment of a time-series station would be desirable, but for reasons of funding or logistics, such establishment may not be feasible. An alternative to an operational time-series station is monitoring of sites via remote sensing. In this study, the NASA Giovanni data system is employed to simulate the establishment of two time-series stations near the outflow region of California's Eel River, which carries a high sediment load. Previous time-series analysis of this location (Acker et al. 2009) indicated that remotely-sensed chl a exhibits a statistically significant increasing trend during summer (low flow) months, but no apparent trend during winter (high flow) months. Examination of several newly-available ocean data parameters in Giovanni, including 8-day resolution data, demonstrates the differences in ocean parameter trends at the two locations compared to regionally-averaged time-series. The hypothesis that the increased summer chl a values are related to increasing SST is evaluated, and the signature of the Eel River plume is defined with ocean optical parameters.

  12. Strategic positioning. Part 1: The sources of value under managed care.

    Science.gov (United States)

    Kauer, R T; Berkowitz, E

    1997-01-01

    Part 1 of this series organizes and discusses the sources of value against a background of an evolving managed care market. Part 2 will present, in more detail, the marketing and financial challenges to organizational positioning and performance across the four stages of managed care. What are the basic principles or tenets of value and how do they apply to the health care industry? Why is strategic positioning so important to health care organizations struggling in a managed care environment and what are the sources of value? Service motivated employees and the systems that educate them represent a stronger competitive advantage than having assets and technology that are available to anyone. As the health care marketplace evolves, organizations must develop a strategic position that will provide such value and for which the customer will be willing to pay.

  13. What marketing scholars should know about time series analysis : time series applications in marketing

    NARCIS (Netherlands)

    Horváth, Csilla; Kornelis, Marcel; Leeflang, Peter S.H.

    2002-01-01

    In this review, we give a comprehensive summary of time series techniques in marketing, and discuss a variety of time series analysis (TSA) techniques and models. We classify them in the sets (i) univariate TSA, (ii) multivariate TSA, and (iii) multiple TSA. We provide relevant marketing

  14. An elementary treatise on Fourier's series and spherical, cylindrical, and ellipsoidal harmonics, with applications to problems in mathematical

    CERN Document Server

    Byerly, William Elwood

    2003-01-01

    Originally published over a century ago, this work remains among the most useful and practical expositions of Fourier's series, and spherical, cylindrical, and ellipsoidal harmonics. The subsequent growth of science into a diverse range of specialties has enhanced the value of this classic, whose thorough, basic treatment presents material that is assumed in many other studies but seldom available in such concise form. The development of functions, series, and their differential equations receives detailed explanations, and throughout the text, theory is applied to practical problems, with the

  15. Evaluation of nuclides with closely spaced values of depletion constants in transmutation chains

    International Nuclear Information System (INIS)

    Vukadin, Z.S.

    1977-01-01

    A new method of calculating nuclide concentrations in a transmutation chain is developed in this thesis. The method is based on originally derived recurrence formulas for the expansion series of the depletion functions and on originally obtained, nonsingular Bateman coefficients. An explicit expression for the nuclide concentrations in a transmutation chain is obtained. This expression can be used as it stands for arbitrary values of the nuclide depletion constants. By computing hypothetical transmutation chains and the neptunium series, the method is compared with the Bateman analytical solution, with approximate solutions and with the matrix exponential method. It turns out that the method presented in this thesis is suitable for calculating very long depletion chains even in the case of some closely spaced and/or equal values of the nuclide depletion constants. The presented method is thus of great practical applicability in a number of nuclear physics problems dealing with nuclide transmutations, ranging from studies of stellar evolution to the design of nuclear reactors. (author)
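
    For context, the classic Bateman solution for a linear chain is sketched below; its denominators contain differences of decay constants, so it becomes ill-conditioned when two constants nearly coincide, which is precisely the situation a nonsingular formulation is designed to handle. The chain and constants are illustrative.

      # Hedged sketch: classic Bateman solution for a linear decay chain N1 -> N2 -> ...
      import numpy as np

      def bateman_last_member(lambdas, n0, t):
          """Concentration of the last nuclide of the chain; initial amount n0 in the first."""
          lam = np.asarray(lambdas, dtype=float)
          total = 0.0
          for i in range(lam.size):
              denom = np.prod([lam[j] - lam[i] for j in range(lam.size) if j != i])
              total += np.exp(-lam[i] * t) / denom
          return n0 * np.prod(lam[:-1]) * total

      print(bateman_last_member([2.0, 1.0, 0.5], n0=1.0, t=1.5))          # well-separated constants
      print(bateman_last_member([1.0, 1.0 + 1e-12, 0.5], n0=1.0, t=1.5))  # nearly equal: ill-conditioned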

  16. Time Series Analysis and Forecasting by Example

    CERN Document Server

    Bisgaard, Soren

    2011-01-01

    An intuition-based approach enables you to master time series analysis with ease Time Series Analysis and Forecasting by Example provides the fundamental techniques in time series analysis using various examples. By introducing necessary theory through examples that showcase the discussed topics, the authors successfully help readers develop an intuitive understanding of seemingly complicated time series models and their implications. The book presents methodologies for time series analysis in a simplified, example-based approach. Using graphics, the authors discuss each presented example in

  17. Economic value of dengue vaccine in Thailand.

    Science.gov (United States)

    Lee, Bruce Y; Connor, Diana L; Kitchen, Sarah B; Bacon, Kristina M; Shah, Mirat; Brown, Shawn T; Bailey, Rachel R; Laosiritaworn, Yongjua; Burke, Donald S; Cummings, Derek A T

    2011-05-01

    With several candidate dengue vaccines under development, this is an important time to help stakeholders (e.g., policy makers, scientists, clinicians, and manufacturers) better understand the potential economic value (cost-effectiveness) of a dengue vaccine, especially while vaccine characteristics and strategies might be readily altered. We developed a decision analytic Markov simulation model to evaluate the potential health and economic value of administering a dengue vaccine to an individual (≤ 1 year of age) in Thailand from the societal perspective. Sensitivity analyses evaluated the effects of ranging various vaccine (e.g., cost, efficacy, side effects), epidemiological (dengue risk), and disease (treatment-seeking behavior) characteristics. A ≥ 50% efficacious vaccine was highly cost-effective [incremental cost-effectiveness ratio below 1x the per capita gross domestic product (GDP) ($4,289)] up to a total vaccination cost of $60, and cost-effective [below 3x the per capita GDP ($12,868)] up to a total vaccination cost of $200. When the total vaccine series cost $1.50, many scenarios were cost saving.
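
    A small sketch of the threshold logic implied by the bracketed figures above, assuming WHO-style cost-effectiveness thresholds of one and three times per capita GDP; the thresholds and the effectiveness unit are assumptions for illustration, not details taken from the study.

      # Hedged sketch: cost-effectiveness classification against GDP-based thresholds.
      GDP_PER_CAPITA = 4289        # USD, Thailand, as quoted above

      def classify(icer):
          """icer: incremental cost per unit of health benefit (e.g. per DALY averted)."""
          if icer < 1 * GDP_PER_CAPITA:
              return "highly cost-effective"
          if icer < 3 * GDP_PER_CAPITA:
              return "cost-effective"
          return "not cost-effective"

      print(classify(3500), classify(9000), classify(20000))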

  18. Weighted statistical parameters for irregularly sampled time series

    Science.gov (United States)

    Rimoldini, Lorenzo

    2014-01-01

    Unevenly spaced time series are common in astronomy because of the day-night cycle, weather conditions, dependence on the source position in the sky, allocated telescope time and corrupt measurements, for example, or inherent to the scanning law of satellites like Hipparcos and the forthcoming Gaia. Irregular sampling often causes clumps of measurements and gaps with no data which can severely disrupt the values of estimators. This paper aims at improving the accuracy of common statistical parameters when linear interpolation (in time or phase) can be considered an acceptable approximation of a deterministic signal. A pragmatic solution is formulated in terms of a simple weighting scheme, adapting to the sampling density and noise level, applicable to large data volumes at minimal computational cost. Tests on time series from the Hipparcos periodic catalogue led to significant improvements in the overall accuracy and precision of the estimators with respect to the unweighted counterparts and those weighted by inverse-squared uncertainties. Automated classification procedures employing statistical parameters weighted by the suggested scheme confirmed the benefits of the improved input attributes. The classification of eclipsing binaries, Mira, RR Lyrae, Delta Cephei and Alpha2 Canum Venaticorum stars employing exclusively weighted descriptive statistics achieved an overall accuracy of 92 per cent, about 6 per cent higher than with unweighted estimators.
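
    One simple realization consistent with the description above (an assumption, not the paper's exact scheme): weight each observation by half the time span between its neighbouring epochs, so that clumps of measurements do not dominate the estimators; the published scheme also adapts to the noise level, which this sketch omits.

      # Hedged sketch: time-gap ("trapezoidal") weights for unevenly sampled series.
      import numpy as np

      def gap_weights(t):
          """Weight each epoch by half the span between its neighbours, then normalize."""
          t = np.asarray(t, dtype=float)
          dt = np.diff(t)
          w = np.empty_like(t)
          w[0], w[-1] = dt[0] / 2, dt[-1] / 2
          w[1:-1] = (dt[:-1] + dt[1:]) / 2
          return w / w.sum()

      def weighted_mean_std(t, x):
          w = gap_weights(t)
          mean = np.sum(w * x)
          var = np.sum(w * (x - mean)**2) / (1.0 - np.sum(w**2))   # one common bias correction
          return mean, np.sqrt(var)

      rng = np.random.default_rng(7)
      t = np.sort(np.concatenate([rng.uniform(0, 10, 20), np.linspace(4.0, 4.1, 50)]))
      x = np.sin(t) + rng.normal(0, 0.1, t.size)                   # clump of points near t = 4
      print(weighted_mean_std(t, x))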

  19. An Evaluation of American English File Series

    Directory of Open Access Journals (Sweden)

    Behnam Ghasemi

    2012-03-01

    Full Text Available Textbooks play a pivotal role in language learning classrooms. The problem is deciding which, among the wide range of textbooks on the market, is appropriate for a specific classroom and group of learners. In order to evaluate ELT textbooks, theorists and writers have offered different kinds of evaluative frameworks based on a number of principles and criteria. This study evaluates a series of ELT textbooks, namely American English File, by the use of Littlejohn's (1998) evaluative framework to see what the explicit features of the book are, what pedagogic values it has, whether it is in line with its claimed objectives, and what its merits and demerits are. Littlejohn believes that we should evaluate a textbook based on its own pedagogic values and should see what is in it, not what teachers and evaluators think must exist in it. Consequently, his framework is claimed to be devoid of any impressionistic ideas and to be in-depth and objective rather than subjective. Nine ELT experts and ten ELT teachers helped the researcher rate the evaluative checklists. The results of the study show that although a number of shortcomings and drawbacks were found in American English File, it stood up reasonably well to a detailed and in-depth analysis, and its pedagogic values and positive attributes far outweighed its shortcomings. The internal consistency between ratings was computed via Cronbach's alpha, which indicated a desirable inter-rater reliability.

  20. Models for dependent time series

    CERN Document Server

    Tunnicliffe Wilson, Granville; Haywood, John

    2015-01-01

    Models for Dependent Time Series addresses the issues that arise and the methodology that can be applied when the dependence between time series is described and modeled. Whether you work in the economic, physical, or life sciences, the book shows you how to draw meaningful, applicable, and statistically valid conclusions from multivariate (or vector) time series data.The first four chapters discuss the two main pillars of the subject that have been developed over the last 60 years: vector autoregressive modeling and multivariate spectral analysis. These chapters provide the foundational mater