WorldWideScience

Sample records for series analysis techniques

  1. Comparison of correlation analysis techniques for irregularly sampled time series

    Directory of Open Access Journals (Sweden)

    K. Rehfeld

    2011-06-01

    Geoscientific measurements often provide time series with irregular time sampling, requiring either data reconstruction (interpolation) or sophisticated methods that handle irregular sampling directly. We compare the linear interpolation technique with different approaches for analyzing the correlation functions and persistence of irregularly sampled time series, such as Lomb-Scargle Fourier transformation and kernel-based methods. In a thorough benchmark test we investigate the performance of these techniques.

    All methods have comparable root mean square errors (RMSEs) for low skewness of the inter-observation time distribution. For high skewness, i.e. very irregular data, interpolation bias and RMSE increase strongly. We find a 40 % lower RMSE for the lag-1 autocorrelation function (ACF) for the Gaussian kernel method vs. the linear interpolation scheme, in the analysis of highly irregular time series. For the cross correlation function (CCF) the RMSE is then lower by 60 %. The application of the Lomb-Scargle technique gave results comparable to the kernel methods for the univariate, but poorer results in the bivariate case. Especially the high-frequency components of the signal, where classical methods show a strong bias in ACF and CCF magnitude, are preserved when using the kernel methods.

    We illustrate the performance of the interpolation vs. the Gaussian kernel method by applying both to paleo-data from four locations reflecting late Holocene Asian monsoon variability, as derived from speleothem δ18O measurements. Cross correlation results are similar for both methods, which we attribute to the long time scales of the common variability. The persistence time (memory) is strongly overestimated when using the standard, interpolation-based, approach. Hence, the Gaussian kernel is a reliable and more robust estimator, with significant advantages compared to other techniques, and is suitable for large-scale application to paleo-data.
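
    The kernel idea the abstract describes can be sketched compactly: correlate all observation pairs, weighting each pair by a Gaussian kernel centered on the target lag, so that no interpolation onto a regular grid is needed. The function below is an illustrative reconstruction rather than the authors' code; the bandwidth h and the synthetic test signal are arbitrary choices.

```python
import numpy as np

def gaussian_kernel_acf(t, x, lags, h):
    """Kernel-based autocorrelation for an irregularly sampled series (t, x):
    every pair of observations contributes, weighted by how close its time
    separation is to the target lag. h is the kernel bandwidth."""
    z = (x - x.mean()) / x.std()          # standardize so products estimate correlation
    dt = t[:, None] - t[None, :]          # all pairwise time differences
    prod = z[:, None] * z[None, :]        # all pairwise products
    out = []
    for lag in lags:
        w = np.exp(-0.5 * ((dt - lag) / h) ** 2)   # Gaussian weight around the target lag
        out.append((w * prod).sum() / w.sum())
    return np.array(out)

# irregularly sampled noisy oscillation as a stand-in for proxy data
rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0.0, 100.0, 300))
x = np.sin(0.3 * t) + 0.3 * rng.standard_normal(300)
acf = gaussian_kernel_acf(t, x, lags=[0.0, 1.0, 5.0], h=0.5)
```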

  2. Time-series-analysis techniques applied to nuclear-material accounting

    International Nuclear Information System (INIS)

    Pike, D.H.; Morrison, G.W.; Downing, D.J.

    1982-05-01

    This document is designed to introduce the reader to the applications of time series analysis techniques to nuclear material accountability data. Time series analysis techniques are designed to extract information from a collection of random variables ordered by time by seeking to identify any trends, patterns, or other structure in the series. Since nuclear material accountability data form a time series, one can extract more information using time series analysis techniques than by using other statistical techniques. Specifically, the objective of this document is to examine the applicability of time series analysis techniques to enhance loss detection of special nuclear materials. An introductory section examines the current industry approach, which utilizes inventory differences. The error structure of inventory differences is presented. Time series analysis techniques discussed include the Shewhart Control Chart, the Cumulative Summation of Inventory Differences Statistics (CUSUM), and the Kalman Filter and Linear Smoother.
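
    As a sketch of how a CUSUM scheme flags a sustained shift in standardized inventory differences: two one-sided sums accumulate deviations beyond a slack value and trigger an alarm when either crosses a threshold. The reference value k and decision threshold h below are illustrative, not taken from the report.

```python
def cusum(z, k=0.5, h=5.0):
    """Two-sided tabular CUSUM on a standardized sequence z (mean 0, sd 1
    when in control). k is the reference (slack) value and h the decision
    threshold, both in standard-deviation units; returns the index of the
    first alarm, or None if the statistic never crosses h."""
    s_hi = s_lo = 0.0
    for i, zi in enumerate(z):
        s_hi = max(0.0, s_hi + zi - k)   # accumulates positive shifts (apparent loss)
        s_lo = max(0.0, s_lo - zi - k)   # accumulates negative shifts
        if s_hi > h or s_lo > h:
            return i
    return None

# thirty in-control periods followed by a sustained 2-sigma shift
alarm = cusum([0.0] * 30 + [2.0] * 20)
```

Unlike a Shewhart chart, which reacts only to large single-period deviations, the CUSUM accumulates small persistent losses, which is why it appears alongside the Kalman filter in the loss-detection toolkit described above.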

  3. On-line diagnostic techniques for air-operated control valves based on time series analysis

    International Nuclear Information System (INIS)

    Ito, Kenji; Matsuoka, Yoshinori; Minamikawa, Shigeru; Komatsu, Yasuki; Satoh, Takeshi.

    1996-01-01

    The objective of this research is to study the feasibility of applying on-line diagnostic techniques based on time series analysis to air-operated control valves - numerous valves of this type are used in PWR plants. Generally, these techniques can detect anomalies caused by failures at an early stage, when detection by conventional surveillance of directly measured process parameters is difficult. However, the effectiveness of these techniques depends on the system being diagnosed. The difficulties in applying diagnostic techniques to air-operated control valves seem to come from the reduced sensitivity of their response compared with hydraulic control systems, as well as the need to identify anomalies in low-level signals that fluctuate only slightly but continuously. In this research, simulation tests were performed by setting various kinds of failure modes for a test valve with the same specifications as a valve actually used in plants. Actual control signals recorded from an operating plant were then used as input signals for the simulation. The results of the tests confirmed the feasibility of applying on-line diagnostic techniques based on time series analysis to air-operated control valves. (author)

  4. Investigation of interfacial wave structure using time-series analysis techniques

    International Nuclear Information System (INIS)

    Jayanti, S.; Hewitt, G.F.; Cliffe, K.A.

    1990-09-01

    The report presents an investigation into the interfacial structure in horizontal annular flow using spectral and time-series analysis techniques. Film thickness measured using conductance probes shows an interesting transition in wave pattern from a continuous low-frequency wave pattern to an intermittent, high-frequency one. From the autospectral density function of the film thickness, it appears that this transition is caused by the breaking up of long waves into smaller ones. To investigate the possibility of the wave structure being represented as a low order chaotic system, phase portraits of the time series were constructed using the technique developed by Broomhead and co-workers (1986, 1987 and 1989). These showed a banded structure when waves of relatively high frequency were filtered out. Although these results are encouraging, further work is needed to characterise the attractor. (Author)

  5. Possible signatures of dissipation from time-series analysis techniques using a turbulent laboratory magnetohydrodynamic plasma

    International Nuclear Information System (INIS)

    Schaffner, D. A.; Brown, M. R.; Rock, A. B.

    2016-01-01

    The frequency spectrum of magnetic fluctuations as measured on the Swarthmore Spheromak Experiment is broadband and exhibits a nearly Kolmogorov 5/3 scaling. It features a steepening region which is indicative of dissipation of magnetic fluctuation energy similar to that observed in fluid and magnetohydrodynamic turbulence systems. Two non-spectrum based time-series analysis techniques are implemented on this data set in order to seek other possible signatures of turbulent dissipation beyond just the steepening of fluctuation spectra. Presented here are results for the flatness, permutation entropy, and statistical complexity, each of which exhibits a particular character at spectral steepening scales which can then be compared to the behavior of the frequency spectrum.
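
    Permutation entropy, one of the non-spectral measures named above, depends only on the ordinal patterns of the signal. A minimal sketch of the Bandt-Pompe formulation (order and delay are illustrative choices, and the related statistical complexity measure builds on the same pattern distribution):

```python
import math

def permutation_entropy(x, order=3, delay=1):
    """Normalized permutation entropy (Bandt-Pompe): Shannon entropy of the
    ordinal patterns of length `order`, scaled to [0, 1]."""
    n = len(x) - (order - 1) * delay
    counts = {}
    for i in range(n):
        window = tuple(x[i + j * delay] for j in range(order))
        pattern = tuple(sorted(range(order), key=window.__getitem__))  # ordinal rank pattern
        counts[pattern] = counts.get(pattern, 0) + 1
    probs = [c / n for c in counts.values()]
    entropy = -sum(p * math.log(p) for p in probs)
    return entropy / math.log(math.factorial(order))
```

A monotone series yields entropy 0 (one pattern), while fully irregular data approaches 1; a drop in permutation entropy at small scales is the kind of signature the authors examine near the spectral steepening.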

  6. Time series analysis of wind speed using VAR and the generalized impulse response technique

    Energy Technology Data Exchange (ETDEWEB)

    Ewing, Bradley T. [Area of Information Systems and Quantitative Sciences, Rawls College of Business and Wind Science and Engineering Research Center, Texas Tech University, Lubbock, TX 79409-2101 (United States); Kruse, Jamie Brown [Center for Natural Hazard Research, East Carolina University, Greenville, NC (United States); Schroeder, John L. [Department of Geosciences and Wind Science and Engineering Research Center, Texas Tech University, Lubbock, TX (United States); Smith, Douglas A. [Department of Civil Engineering and Wind Science and Engineering Research Center, Texas Tech University, Lubbock, TX (United States)

    2007-03-15

    This research examines the interdependence in time series wind speed data measured at the same location at four different heights. A multiple-equation system known as a vector autoregression is proposed for characterizing the time series dynamics of wind. Additionally, the recently developed method of generalized impulse response analysis provides insight into the cross-effects of the wind series and their responses to shocks. Findings are based on analysis of contemporaneous wind speed time histories taken at 13, 33, 70 and 160 ft above ground level with a sampling rate of 10 Hz. The results indicate that the wind speed measured at 70 ft was the most variable. Further, turbulence persisted longer at the 70-ft measurement height than at the other heights. The greatest interdependence is observed at 13 ft. Gusts at 160 ft showed the greatest persistence to an 'own' shock and induced the greatest persistence in the responses of the other wind series. (author)
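
    The multiple-equation setup can be illustrated with a least-squares VAR(1) fit and its impulse responses. This is a generic sketch on synthetic data, not the authors' estimation: their lag order may differ, and the generalized impulse responses they use additionally weight shocks by the error covariance, whereas the plain responses below are simple powers of the coefficient matrix.

```python
import numpy as np

def fit_var1(Y):
    """Least-squares fit of a VAR(1) model y_t = c + A y_{t-1} + e_t.
    Y has shape (T, k); returns the intercept c (k,) and matrix A (k, k)."""
    X = np.hstack([np.ones((len(Y) - 1, 1)), Y[:-1]])   # regressors: constant + lagged values
    B, *_ = np.linalg.lstsq(X, Y[1:], rcond=None)
    return B[0], B[1:].T

def impulse_response(A, shock, horizon):
    """Response of the fitted system h steps after a one-time shock."""
    return [np.linalg.matrix_power(A, h) @ shock for h in range(horizon)]

# two coupled synthetic series standing in for wind speeds at two heights
rng = np.random.default_rng(42)
A_true = np.array([[0.6, 0.2], [0.1, 0.5]])
Y = np.zeros((2000, 2))
for step in range(1, 2000):
    Y[step] = A_true @ Y[step - 1] + 0.1 * rng.standard_normal(2)
c, A = fit_var1(Y)
```

The off-diagonal entries of A capture exactly the cross-height interdependence the abstract discusses, and how quickly the impulse responses decay measures the persistence of a gust.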

  7. Visual time series analysis

    DEFF Research Database (Denmark)

    Fischer, Paul; Hilbert, Astrid

    2012-01-01

    We introduce a platform which supplies an easy-to-handle, interactive, extendable, and fast analysis tool for time series analysis. In contrast to other software suites like Maple, Matlab, or R, which use a command-line-like interface and where the user has to memorize/look-up the appropriate...... commands, our application is select-and-click-driven. It allows the user to derive many different sequences of deviations for a given time series and to visualize them in different ways in order to judge their expressive power and to reuse the procedure found. For many transformations or model fits, the user may...... choose between manual and automated parameter selection. The user can define new transformations and add them to the system. The application contains efficient implementations of advanced and recent techniques for time series analysis including techniques related to extreme value analysis and filtering...

  8. Time series analysis of reference crop evapotranspiration using soft computing techniques for Ganjam District, Odisha, India

    Science.gov (United States)

    Patra, S. R.

    2017-12-01

    Evapotranspiration (ET0) influences water resources and is considered a vital process in arid hydrologic frameworks. It is one of the most important measures in identifying drought conditions. Therefore, time series forecasting of evapotranspiration is very important to help decision makers and water system managers build proper systems to sustain and manage water resources. Time series analysis assumes that history repeats itself: by analysing past values, better choices, or forecasts, can be made for the future. Ten years of ET0 data were used in this study to ensure a satisfactory forecast of monthly values. Three models are presented: an autoregressive integrated moving average (ARIMA) model, an artificial neural network (ANN) model, and a support vector machine (SVM) model. These three models are used for forecasting monthly reference crop evapotranspiration based on ten years of past historical records (1991-2001) of measured evaporation at the Ganjam region, Odisha, India, without considering climate data. The developed models allow water resource managers to predict up to 12 months ahead, making these predictions very useful for optimizing the resources needed for effective water resources management. In this study, multistep-ahead prediction is performed, which is more complex and troublesome than one-step-ahead prediction. Our investigation suggests that nonlinear relationships may exist among the monthly indices, so that the ARIMA model might not be able to fully extract the relationships hidden in the historical data. Support vector machines are potentially useful time series forecasting strategies on account of their strong nonlinear mapping capability and resistance to complexity in forecasting data. SVMs have greater learning capability in time series modelling than ANNs. For instance, SVMs implement the structural risk minimization principle, which allows better generalization compared to neural networks that use the empirical risk
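
    The recursive multistep-ahead mechanism mentioned above can be shown with a simple linear autoregression standing in as a placeholder for the paper's ARIMA/ANN/SVM models (the lag order and the synthetic monthly cycle below are illustrative, not the study's data):

```python
import numpy as np

def fit_ar(x, p):
    """Least-squares AR(p) fit with intercept; a linear stand-in for the
    SVM/ANN/ARIMA models, used here to show the forecasting mechanics."""
    X = np.column_stack([x[p - i - 1:len(x) - i - 1] for i in range(p)])
    X = np.hstack([np.ones((len(X), 1)), X])
    coef, *_ = np.linalg.lstsq(X, x[p:], rcond=None)
    return coef

def forecast_recursive(x, coef, steps):
    """Recursive multistep forecast: each prediction is fed back as an input
    for the next step, which is why errors can compound - the 'complex and
    troublesome' aspect of multistep-ahead prediction."""
    p = len(coef) - 1
    hist = list(x[-p:])
    out = []
    for _ in range(steps):
        feats = np.concatenate([[1.0], hist[::-1][:p]])  # newest value first
        yhat = float(feats @ coef)
        out.append(yhat)
        hist.append(yhat)
    return np.array(out)

# synthetic 12-month seasonal cycle standing in for monthly ET0 records
x = np.sin(2 * np.pi * np.arange(120) / 12)
coef = fit_ar(x, 3)
forecast = forecast_recursive(x, coef, 12)
```

A nonlinear learner such as an SVM would replace `fit_ar` while the same feed-back-the-prediction loop produces the 12-month horizon described in the abstract.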

  9. Developing a complex independent component analysis technique to extract non-stationary patterns from geophysical time-series

    Science.gov (United States)

    Forootan, Ehsan; Kusche, Jürgen

    2016-04-01

    Geodetic/geophysical observations, such as time series of global terrestrial water storage change or sea level and temperature change, represent samples of physical processes and therefore contain information about complex physical interactions with many inherent time scales. Extracting relevant information from these samples, for example quantifying the seasonality of a physical process or its variability due to large-scale ocean-atmosphere interactions, is not possible with simple time series approaches. In recent decades, decomposition techniques have found increasing interest for extracting patterns from geophysical observations. Traditionally, principal component analysis (PCA) and, more recently, independent component analysis (ICA) are common techniques to extract statistically orthogonal (uncorrelated) and independent modes that represent the maximum variance of observations, respectively. PCA and ICA can be classified as stationary signal decomposition techniques, since they are based on decomposing the auto-covariance matrix or diagonalizing higher-order (above second-order) statistical tensors computed from centered time series. However, the stationarity assumption is obviously not justifiable for many geophysical and climate variables, even after removing cyclic components, e.g. the seasonal cycles. In this paper, we present a new decomposition method, the complex independent component analysis (CICA, Forootan, PhD-2014), which can be applied to extract non-stationary (changing in space and time) patterns from geophysical time series. Here, CICA is derived as an extension of real-valued ICA (Forootan and Kusche, JoG-2012), where we (i) define a new complex data set using a Hilbert transformation. The complex time series contain the observed values in their real part, and the temporal rate of variability in their imaginary part. (ii) An ICA algorithm based on diagonalization of fourth-order cumulants is then applied to decompose the new complex data set in (i
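
    Step (i), forming the complex series, can be sketched with an FFT-based Hilbert transform (the same construction scipy.signal.hilbert uses). This is a generic illustration of the analytic-signal idea, not the authors' implementation:

```python
import numpy as np

def analytic_signal(x):
    """Analytic signal via the FFT: zero the negative frequencies and double
    the positive ones. The real part is the observation; the imaginary part
    is its Hilbert transform, the phase-shifted companion that carries the
    temporal rate of variability."""
    n = len(x)
    spec = np.fft.fft(x)
    gain = np.zeros(n)
    gain[0] = 1.0
    if n % 2 == 0:
        gain[n // 2] = 1.0
        gain[1:n // 2] = 2.0
    else:
        gain[1:(n + 1) // 2] = 2.0
    return np.fft.ifft(spec * gain)
```

For a pure cosine the result is exp(i*omega*t), so the imaginary part is the quarter-cycle-shifted sine; applying a complex ICA to such series is what lets CICA track patterns whose amplitude and phase drift in time.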

  10. Aerodynamic analysis of S series wind turbine airfoils by using X foil technique

    International Nuclear Information System (INIS)

    Zaheer, M.A.; Munir, M.A.; Zahid, I.; Rizwan, M.

    2015-01-01

    To extract maximum energy from a wind turbine economically, the blade profile must perform well. Extracting maximum power from the wind requires wind turbine rotor models with high rotation rates and power coefficients. Maximum power can also be extracted by using suitable airfoils at the root and tip sections of the turbine blades. In this research, four different S-series airfoils were selected to study their behavior for maximum power extraction from wind. The wind conditions during the research were ascertained from wind speeds over Kallar Kahar, Pakistan. Lift and drag forces are the most important parameters in studying wind turbine operation; this research therefore examines the lift and drag forces at various sections of the wind turbine blade. To obtain the utmost power from a wind turbine, the highest value of the sliding ratio is required. The performance of several blade profiles was analyzed at various wind speeds, and for every wind speed the appropriate blade profile was determined based on the highest sliding ratio. For every airfoil, the optimum angle of attack was determined at numerous wind speeds. (author)
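
    The selection rule described above, picking the angle of attack with the highest sliding ratio (lift-to-drag ratio) from a computed polar, reduces to a one-line maximization. The polar values below are hypothetical illustrations, not XFoil output for any S-series airfoil:

```python
def best_angle(polar):
    """Pick the angle of attack with the highest sliding ratio Cl/Cd from a
    tabulated polar of (alpha_degrees, Cl, Cd) rows."""
    return max(polar, key=lambda row: row[1] / row[2])[0]

# hypothetical polar rows for illustration only (not XFoil output)
polar = [(2, 0.45, 0.008), (4, 0.70, 0.009), (6, 0.90, 0.011), (8, 1.05, 0.016)]
```

Repeating this over the polars computed at each wind speed yields the per-speed optimum angle of attack the abstract refers to.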

  11. A comparative analysis of spectral exponent estimation techniques for 1/f^β processes with applications to the analysis of stride interval time series.

    Science.gov (United States)

    Schaefer, Alexander; Brach, Jennifer S; Perera, Subashan; Sejdić, Ervin

    2014-01-30

    The time evolution and complex interactions of many nonlinear systems, such as in the human body, result in fractal types of parameter outcomes that exhibit self-similarity over long time scales by a power law in the frequency spectrum S(f) = 1/f^β. The scaling exponent β is thus often interpreted as a "biomarker" of relative health and decline. This paper presents a thorough comparative numerical analysis of fractal characterization techniques with specific consideration given to experimentally measured gait stride interval time series. The ideal fractal signals generated in the numerical analysis are constrained under varying lengths and biases indicative of a range of physiologically conceivable fractal signals. This analysis is to complement previous investigations of fractal characteristics in healthy and pathological gait stride interval time series, with which this study is compared. The results of our analysis showed that the averaged wavelet coefficient method consistently yielded the most accurate results. Class-dependent methods proved to be unsuitable for physiological time series. Detrended fluctuation analysis, the most prevalent method in the literature, exhibited large estimation variances. The comparative numerical analysis and experimental applications provide a thorough basis for determining an appropriate and robust method for measuring and comparing a physiologically meaningful biomarker, the spectral index β. In consideration of the constraints of application, we note the significant drawbacks of detrended fluctuation analysis and conclude that the averaged wavelet coefficient method can provide reasonable consistency and accuracy for characterizing these fractal time series. Copyright © 2013 Elsevier B.V. All rights reserved.
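
    For intuition, the spectral index β can be estimated by regressing log power on log frequency; this periodogram fit is one of the simpler estimators in the class such studies compare, not the averaged-wavelet-coefficient method the paper recommends:

```python
import numpy as np

def spectral_beta(x):
    """Estimate the exponent beta of a 1/f^beta process by regressing log
    periodogram power on log frequency (a basic estimator; more refined
    methods such as averaged wavelet coefficients reduce bias/variance)."""
    spec = np.fft.rfft(x - np.mean(x))
    freq = np.fft.rfftfreq(len(x))[1:]          # drop the zero frequency
    power = np.abs(spec[1:]) ** 2
    slope, _ = np.polyfit(np.log(freq), np.log(power), 1)
    return -slope

# white noise has beta near 0; its cumulative sum (a random walk) near 2
rng = np.random.default_rng(3)
w = rng.standard_normal(2 ** 14)
```

Even on ideal synthetic signals this simple fit shows bias at high frequencies, which is exactly the kind of estimator-dependent behavior the comparative analysis quantifies.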

  12. Statistical techniques applied to aerial radiometric surveys (STAARS): series introduction and the principal-components-analysis method

    International Nuclear Information System (INIS)

    Pirkle, F.L.

    1981-04-01

    STAARS is a new series which is being published to disseminate information concerning statistical procedures for interpreting aerial radiometric data. The application of a particular data interpretation technique to geologic understanding for delineating regions favorable to uranium deposition is the primary concern of STAARS. Statements concerning the utility of a technique on aerial reconnaissance data as well as detailed aerial survey data will be included

  13. A comparative analysis of spectral exponent estimation techniques for 1/f^β processes with applications to the analysis of stride interval time series

    Science.gov (United States)

    Schaefer, Alexander; Brach, Jennifer S.; Perera, Subashan; Sejdić, Ervin

    2013-01-01

    Background The time evolution and complex interactions of many nonlinear systems, such as in the human body, result in fractal types of parameter outcomes that exhibit self-similarity over long time scales by a power law in the frequency spectrum S(f) = 1/f^β. The scaling exponent β is thus often interpreted as a "biomarker" of relative health and decline. New Method This paper presents a thorough comparative numerical analysis of fractal characterization techniques with specific consideration given to experimentally measured gait stride interval time series. The ideal fractal signals generated in the numerical analysis are constrained under varying lengths and biases indicative of a range of physiologically conceivable fractal signals. This analysis is to complement previous investigations of fractal characteristics in healthy and pathological gait stride interval time series, with which this study is compared. Results The results of our analysis showed that the averaged wavelet coefficient method consistently yielded the most accurate results. Comparison with Existing Methods: Class-dependent methods proved to be unsuitable for physiological time series. Detrended fluctuation analysis, the most prevalent method in the literature, exhibited large estimation variances. Conclusions The comparative numerical analysis and experimental applications provide a thorough basis for determining an appropriate and robust method for measuring and comparing a physiologically meaningful biomarker, the spectral index β. In consideration of the constraints of application, we note the significant drawbacks of detrended fluctuation analysis and conclude that the averaged wavelet coefficient method can provide reasonable consistency and accuracy for characterizing these fractal time series. PMID:24200509

  14. Applied time series analysis

    CERN Document Server

    Woodward, Wayne A; Elliott, Alan C

    2011-01-01

    "There is scarcely a standard technique that the reader will find left out … this book is highly recommended for those requiring a ready introduction to applicable methods in time series and serves as a useful resource for pedagogical purposes." (International Statistical Review (2014), 82) "Current time series theory for practice is well summarized in this book." (Emmanuel Parzen, Texas A&M University) "What an extraordinary range of topics covered, all very insightfully. I like [the authors'] innovations very much, such as the AR factor table." (David Findley, U.S. Census Bureau, retired) …

  15. Time series prediction: statistical and neural techniques

    Science.gov (United States)

    Zahirniak, Daniel R.; DeSimio, Martin P.

    1996-03-01

    In this paper we compare the performance of nonlinear neural network techniques to those of linear filtering techniques in the prediction of time series. Specifically, we compare the results of using the nonlinear systems, known as multilayer perceptron and radial basis function neural networks, with the results obtained using the conventional linear Wiener filter, Kalman filter and Widrow-Hoff adaptive filter in predicting future values of stationary and non-stationary time series. Our results indicate the performance of each type of system is heavily dependent upon the form of the time series being predicted and the size of the system used. In particular, the linear filters perform adequately for linear or near linear processes while the nonlinear systems perform better for nonlinear processes. Since the linear systems take much less time to be developed, they should be tried prior to using the nonlinear systems when the linearity properties of the time series process are unknown.
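
    The Widrow-Hoff adaptive filter mentioned above can be sketched in a few lines of LMS one-step prediction; the filter order, step size, and test signal here are illustrative choices, not the paper's configuration:

```python
import numpy as np

def lms_predict(x, p=4, mu=0.05):
    """One-step-ahead prediction with a Widrow-Hoff (LMS) adaptive filter:
    after each sample the weights are nudged along the instantaneous
    error gradient, so the predictor tracks slowly changing dynamics."""
    w = np.zeros(p)
    preds = np.zeros(len(x))
    for n in range(p, len(x)):
        u = x[n - p:n][::-1]          # most recent sample first
        preds[n] = w @ u
        err = x[n] - preds[n]
        w += mu * err * u             # stochastic-gradient weight update
    return preds

# after adaptation, a pure sinusoid is predicted almost perfectly
x = np.sin(0.2 * np.arange(2000))
preds = lms_predict(x)
```

This continual adaptation is what distinguishes the Widrow-Hoff filter from a fixed Wiener solution, and it is one reason filter choice interacts with the stationarity of the series, as the abstract notes.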

  16. Time Series Analysis and Forecasting by Example

    CERN Document Server

    Bisgaard, Soren

    2011-01-01

    An intuition-based approach enables you to master time series analysis with ease Time Series Analysis and Forecasting by Example provides the fundamental techniques in time series analysis using various examples. By introducing necessary theory through examples that showcase the discussed topics, the authors successfully help readers develop an intuitive understanding of seemingly complicated time series models and their implications. The book presents methodologies for time series analysis in a simplified, example-based approach. Using graphics, the authors discuss each presented example in

  17. [Liver transplantation--indications, surgical technique, results--the analysis of a clinical series of 200 cases].

    Science.gov (United States)

    Popescu, I; Ionescu, M; Braşoveanu, V; Hrehoreţ, D; Matei, E; Dorobantu, B; Zamfir, R; Alexandrescu, S; Grigorie, M; Tulbure, D; Popa, L; Ungureanu, M; Tomescu, D; Droc, G; Popescu, H; Cristea, A; Gheorghe, L; Iacob, S; Gheorghe, C; Boroş, M; Lupescu, I; Vlad, L; Herlea, V; Croitoru, M; Platon, P; Alloub, A

    2010-01-01

    Initially considered experimental, liver transplantation (LT) has become the treatment of choice for patients with end-stage liver disease. In this retrospective study, 200 LTs (including 10 reLTs) were performed in 190 patients between April 2000 and October 2009. The recipients were 110 men and 80 women: 159 adults and 31 children, aged between 1 and 64 years (mean age 39.9). The main indication in the adult group was viral cirrhosis, while in the pediatric series the etiology was mainly glycogenosis and biliary atresia. In total, 143 whole-graft LTs, 46 living-donor LTs, 6 split LTs, 4 reduced LTs and one domino LT were performed. RESULTS: The postoperative survival was 90% (170 patients). The patient and graft one-year and five-year survival rates were 76.9%, 73.6% and 71%, 68.2%, respectively. Early complications occurred in 127 patients (67%). Late complications were recorded in 71 patients (37.3%). The intraoperative and early postoperative mortality rate was 9.5% (18 patients). The Romanian liver transplantation program at Fundeni includes all types of current surgical techniques, and the results are comparable with those of other international centers.

  18. Analysis of trend in temperature and rainfall time series of an Indian arid region: comparative evaluation of salient techniques

    Science.gov (United States)

    Machiwal, Deepesh; Gupta, Ankit; Jha, Madan Kumar; Kamble, Trupti

    2018-04-01

    This study investigated trends in 35 years (1979-2013) of temperature (maximum, Tmax, and minimum, Tmin) and rainfall data at annual and seasonal (pre-monsoon, monsoon, post-monsoon, and winter) scales for 31 grid points in a coastal arid region of India. Box-whisker plots of the annual temperature and rainfall time series depict systematic spatial gradients. Trends were examined by applying eight tests: Kendall rank correlation (KRC), Spearman rank order correlation (SROC), Mann-Kendall (MK), four modified MK tests, and innovative trend analysis (ITA). Trend magnitudes were quantified by Sen's slope estimator, and a new method was adopted to assess the significance of linear trends in MK-test statistics. It was found that significant serial correlation is prominent in the annual and post-monsoon Tmax and Tmin, and in pre-monsoon Tmin. The KRC and MK tests yielded similar results, in close resemblance with the SROC test. The performance of the two modified MK tests employing variance-correction approaches was superior to the KRC, MK, modified MK with pre-whitening, and ITA tests. The performance of the original MK test is poor due to the presence of serial correlation, whereas the ITA method is over-sensitive in identifying trends. Significantly increasing trends are more prominent in Tmin than Tmax. Further, both the annual and monsoon rainfall time series have a significantly increasing trend of 9 mm year⁻¹. The sequential significance of the linear trend in MK test statistics is very strong (R² ≥ 0.90) in the annual and pre-monsoon Tmin (90% of grid points), and strong (R² ≥ 0.75) in monsoon Tmax (68% of grid points); in monsoon, post-monsoon, and winter Tmin (65, 55, and 48% of grid points, respectively); and in the annual and monsoon rainfalls (68 and 61% of grid points, respectively). Finally, this study recommends use of the variance-corrected MK test for precise identification of trends. It is emphasized that the rising Tmax may hamper crop growth due to enhanced
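
    The core Mann-Kendall statistic and Sen's slope are short to state in code. This sketch omits the tie handling and the variance corrections for serial correlation that the modified MK variants in the study address:

```python
import math

def mann_kendall(x):
    """Classical Mann-Kendall trend test plus Sen's slope estimator.
    Returns (Z statistic, Sen slope). No tie or serial-correlation
    corrections are applied in this minimal version."""
    n = len(x)
    s = sum((x[j] > x[i]) - (x[j] < x[i])
            for i in range(n) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18          # variance of S, no ties
    z = (s - math.copysign(1, s)) / math.sqrt(var_s) if s != 0 else 0.0
    slopes = sorted((x[j] - x[i]) / (j - i)
                    for i in range(n) for j in range(i + 1, n))
    sen = slopes[len(slopes) // 2]   # median pairwise slope (upper middle if even)
    return z, sen
```

|Z| > 1.96 indicates a significant monotonic trend at the 5% level; because positive serial correlation inflates |Z|, the study's variance-corrected variants rescale var_s before computing Z.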

  19. The foundations of modern time series analysis

    CERN Document Server

    Mills, Terence C

    2011-01-01

    This book develops the analysis of Time Series from its formal beginnings in the 1890s through to the publication of Box and Jenkins' watershed publication in 1970, showing how these methods laid the foundations for the modern techniques of Time Series analysis that are in use today.

  20. Time series analysis time series analysis methods and applications

    CERN Document Server

    Rao, Tata Subba; Rao, C R

    2012-01-01

    The field of statistics not only affects all areas of scientific activity, but also many other matters such as public policy. It is branching rapidly into so many different subjects that a series of handbooks is the only way of comprehensively presenting the various aspects of statistical methodology, applications, and recent developments. The Handbook of Statistics is a series of self-contained reference books. Each volume is devoted to a particular topic in statistics, with Volume 30 dealing with time series. The series is addressed to the entire community of statisticians and scientists in various disciplines who use statistical methodology in their work. At the same time, special emphasis is placed on applications-oriented techniques, with the applied statistician in mind as the primary audience. Comprehensively presents the various aspects of statistical methodology Discusses a wide variety of diverse applications and recent developments Contributors are internationally renowned experts in their respect...

  1. Introduction to time series analysis and forecasting

    CERN Document Server

    Montgomery, Douglas C; Kulahci, Murat

    2008-01-01

    An accessible introduction to the most current thinking in, and practice of, forecasting techniques in the context of time-oriented data. Analyzing time-oriented data and forecasting are among the most important problems that analysts face across many fields, ranging from finance and economics to production operations and the natural sciences. As a result, there is a widespread need for large groups of people in a variety of fields to understand the basic concepts of time series analysis and forecasting. Introduction to Time Series Analysis and Forecasting presents the time series analysis branch of applied statistics as the underlying methodology for developing practical forecasts, and it also bridges the gap between theory and practice by equipping readers with the tools needed to analyze time-oriented data and construct useful, short- to medium-term, statistically based forecasts.

  2. What marketing scholars should know about time series analysis : time series applications in marketing

    NARCIS (Netherlands)

    Horváth, Csilla; Kornelis, Marcel; Leeflang, Peter S.H.

    2002-01-01

    In this review, we give a comprehensive summary of time series techniques in marketing, and discuss a variety of time series analysis (TSA) techniques and models. We classify them in the sets (i) univariate TSA, (ii) multivariate TSA, and (iii) multiple TSA. We provide relevant marketing

  3. Adsorption of proteins from plasma to a series of hydrophilic-hydrophobic copolymers. I. Analysis with the in situ radioiodination technique

    International Nuclear Information System (INIS)

    Horbett, T.A.; Weathersby, P.K.

    1981-01-01

    The adsorption of proteins affects cellular interactions with foreign surfaces and thus plays an important role in determining the biocompatibility of implants. Previous studies have indicated differences in the affinity of various proteins for a given polymer, and differences in the affinity of fibrinogen for a series of polymers varying in hydrophilicity. These studies suggest that differences in the composition of the protein layer adsorbed to polymers from plasma might exist. To examine this hypothesis, the proteins adsorbed from plasma to a series of polymers varying in hydrophilicity were analyzed with the iodogram technique. Copolymers of hydroxyethyl methacrylate and ethyl methacrylate made by the radiation grafting technique were exposed to plasma for 0.5 or 150 min. The adsorbed proteins were iodinated, eluted with SDS, and separated with polyacrylamide gel electrophoresis. Fibrinogen, immunoglobulin G, hemoglobin, and a peak tentatively ascribed to prothrombin were the major proteins detected. Very little iodine was incorporated into adsorbed albumin, even though it was shown to be present by a separate experiment using dye binding. The fraction of total radioactivity associated with each of nine proteins was found to vary markedly and systematically among the surfaces. The distribution of radioactivity into the proteins was very different on 0.5- and 150-min plasma-exposed polymers. The results reflect both compositional differences in the adsorbed protein layer on the polymers and differences in the accessibility of proteins to the labeling reagent in the adsorbed state. Differences in the organization of the adsorbed protein layer may play a key role in determining whether cell surface receptors can come in contact with the specific plasma protein able to further stimulate the cell.

  4. Highly comparative time-series analysis: the empirical structure of time series and their methods.

    Science.gov (United States)

    Fulcher, Ben D; Little, Max A; Jones, Nick S

    2013-06-06

    The process of collecting and organizing sets of observations represents a common theme throughout the history of science. However, despite the ubiquity of scientists measuring, recording and analysing the dynamics of different processes, an extensive organization of scientific time-series data and analysis methods has never been performed. Addressing this, annotated collections of over 35 000 real-world and model-generated time series, and over 9000 time-series analysis algorithms are analysed in this work. We introduce reduced representations of both time series, in terms of their properties measured by diverse scientific methods, and of time-series analysis methods, in terms of their behaviour on empirical time series, and use them to organize these interdisciplinary resources. This new approach to comparing across diverse scientific data and methods allows us to organize time-series datasets automatically according to their properties, retrieve alternatives to particular analysis methods developed in other scientific disciplines and automate the selection of useful methods for time-series classification and regression tasks. The broad scientific utility of these tools is demonstrated on datasets of electroencephalograms, self-affine time series, heartbeat intervals, speech signals and others, in each case contributing novel analysis techniques to the existing literature. Highly comparative techniques that compare across an interdisciplinary literature can thus be used to guide more focused research in time-series analysis for applications across the scientific disciplines.
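
    The reduced representations described above can be sketched as small feature vectors. A minimal illustration (mean, spread and lag-1 autocorrelation, computed on synthetic data rather than the paper's library of over 35 000 series):

```python
import numpy as np

def features(x):
    """A tiny, illustrative reduced representation of a time series:
    mean, standard deviation and lag-1 autocorrelation."""
    x = np.asarray(x, dtype=float)
    xc = x - x.mean()
    ac1 = np.dot(xc[:-1], xc[1:]) / np.dot(xc, xc)
    return np.array([x.mean(), x.std(), ac1])

rng = np.random.default_rng(0)
noise = rng.standard_normal(500)            # uncorrelated process
walk = np.cumsum(rng.standard_normal(500))  # strongly persistent process

f_noise, f_walk = features(noise), features(walk)
# Distances between such feature vectors let diverse series be organized
# automatically; here the lag-1 autocorrelation separates the two classes
```

    Distances between such vectors are what allows datasets and methods to be organized and compared across disciplines.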

  5. Time-Series Analysis: A Cautionary Tale

    Science.gov (United States)

    Damadeo, Robert

    2015-01-01

    Time-series analysis has often been a useful tool in atmospheric science for deriving long-term trends in various atmospherically important parameters (e.g., temperature or the concentration of trace gas species). In particular, time-series analysis has been repeatedly applied to satellite datasets in order to derive the long-term trends in stratospheric ozone, which is a critical atmospheric constituent. However, many of the potential pitfalls relating to the non-uniform sampling of the datasets were often ignored and the results presented by the scientific community have been unknowingly biased. A newly developed and more robust application of this technique is applied to the Stratospheric Aerosol and Gas Experiment (SAGE) II version 7.0 ozone dataset and the previous biases and newly derived trends are presented.
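
    A minimal sketch of the underlying idea: regress a trend plus seasonal harmonics at the actual, irregular sample times, so the seasonal cycle cannot alias into the trend. The data here are synthetic stand-ins, not the SAGE II record:

```python
import numpy as np

rng = np.random.default_rng(1)
# Irregular sample times (in years), mimicking non-uniform satellite sampling
t = np.sort(rng.uniform(0.0, 20.0, 400))
# Synthetic "ozone": linear trend + annual cycle + noise (arbitrary units)
y = -0.5 * t + 2.0 * np.sin(2 * np.pi * t) + 0.3 * rng.standard_normal(t.size)

# Design matrix evaluated at the actual (non-uniform) sample times
X = np.column_stack([np.ones_like(t), t,
                     np.sin(2 * np.pi * t), np.cos(2 * np.pi * t)])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
trend = coef[1]   # recovered long-term trend per year
```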

  6. Methods for removal of unwanted signals from gravity time-series: Comparison using linear techniques complemented with analysis of system dynamics

    Science.gov (United States)

    Valencio, Arthur; Grebogi, Celso; Baptista, Murilo S.

    2017-10-01

    The presence of undesirable dominating signals in geophysical experimental data is a challenge in many subfields. One remarkable example is surface gravimetry, where frequencies from Earth tides correspond to time-series fluctuations up to a thousand times larger than the phenomena of major interest, such as hydrological gravity effects or co-seismic gravity changes. This work discusses general methods for the removal of unwanted dominating signals by applying them to eight long-period gravity time series of the International Geodynamics and Earth Tides Service, equivalent to the acquisition from eight instruments at five locations representative of the network. We compare three different conceptual approaches to tide removal: frequency filtering, physical modelling, and data-based modelling. Each approach reveals a different limitation to be considered depending on the intended application. Vestiges of tides remain in the residues of the modelling procedures, whereas the signal was distorted in different ways by the filtering and data-based procedures. The linear techniques employed were power spectral density, spectrogram, cross-correlation, and classical harmonic decomposition, while the system dynamics was analysed by state-space reconstruction and estimation of the largest Lyapunov exponent. Although the tides could not be completely eliminated, they were sufficiently reduced to allow observation of geophysical events of interest above the 10 nm s-2 level, exemplified by a hydrology-related event of 60 nm s-2. The implementations adopted for each conceptual approach are general, so that their principles could be applied to other kinds of data affected by undesired signals composed mainly of periodic or quasi-periodic components.
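
    A minimal sketch of the harmonic-decomposition approach: least-squares fit of sinusoids at known tidal frequencies, then subtraction. The constituent periods (M2, O1), amplitudes and the small "hydrological" event below are assumed, illustrative values in arbitrary units:

```python
import numpy as np

# One sample per minute for 30 days
t = np.arange(0, 30 * 24 * 60) * 60.0        # seconds
# Approximate periods of two dominant constituents: M2 ~ 12.42 h, O1 ~ 25.82 h
tidal_freqs = [1 / (12.42 * 3600), 1 / (25.82 * 3600)]   # Hz

rng = np.random.default_rng(2)
tide = 1000 * np.sin(2 * np.pi * tidal_freqs[0] * t) \
     + 400 * np.cos(2 * np.pi * tidal_freqs[1] * t)
event = 5.0 * np.exp(-((t - t.mean()) / 2e5) ** 2)       # small signal of interest
g = tide + event + 0.5 * rng.standard_normal(t.size)

# Harmonic decomposition: fit sin/cos pairs at the known tidal frequencies
cols = [np.ones_like(t)]
for f in tidal_freqs:
    cols += [np.sin(2 * np.pi * f * t), np.cos(2 * np.pi * f * t)]
X = np.column_stack(cols)
coef, *_ = np.linalg.lstsq(X, g, rcond=None)
residual = g - X @ coef   # tides removed; the small event is now visible
```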

  7. AGARD Flight Test Techniques Series. Volume 9. Aircraft Exterior Noise Measurement and Analysis Techniques. (Le Bruit a l’Exterieur des Aeronefs: Techniques de Mesure et d’Analyse)

    Science.gov (United States)

    1991-04-01

    suggested to me to write an AGARDograph on 'Aircraft Noise Measurement and Analysis Techniques'. Being overjoyed, and quite honoured, I readily agreed to his... (Delta 1 and Delta 2 terms) (b) Source Noise Correction - Jet Engine Noise (Delta 3 term) (c) Source Noise Correction - Propeller Noise (Delta 3...printed out, since it is impractical to write these down by hand during the test). One track on each tape-recorder must be used to record a time code

  8. The analysis of time series: an introduction

    National Research Council Canada - National Science Library

    Chatfield, Christopher

    1989-01-01

    .... A variety of practical examples are given to support the theory. The book covers a wide range of time-series topics, including probability models for time series, Box-Jenkins forecasting, spectral analysis, linear systems and system identification...

  9. A Course in Time Series Analysis

    CERN Document Server

    Peña, Daniel; Tsay, Ruey S

    2011-01-01

    New statistical methods and future directions of research in time series A Course in Time Series Analysis demonstrates how to build time series models for univariate and multivariate time series data. It brings together material previously available only in the professional literature and presents a unified view of the most advanced procedures available for time series model building. The authors begin with basic concepts in univariate time series, providing an up-to-date presentation of ARIMA models, including the Kalman filter, outlier analysis, automatic methods for building ARIMA models, a

  10. Analysis of archaeological pieces with nuclear techniques

    International Nuclear Information System (INIS)

    Tenorio, D.

    2002-01-01

    In this work, nuclear techniques such as Neutron Activation Analysis, PIXE, X-ray fluorescence analysis, metallography, uranium series and Rutherford backscattering are described for use in the analysis of archaeological specimens and materials. Some published works and theses on the analysis of different Mexican and Mesoamerican archaeological sites are also cited. (Author)

  11. ESTUDIO DE SERIES TEMPORALES DE CONTAMINACIÓN AMBIENTAL MEDIANTE TÉCNICAS DE REDES NEURONALES ARTIFICIALES TIME SERIES ANALYSIS OF ATMOSPHERE POLLUTION DATA USING ARTIFICIAL NEURAL NETWORKS TECHNIQUES

    Directory of Open Access Journals (Sweden)

    Giovanni Salini Calderón

    2006-12-01

    Full Text Available An artificial neural network (ANN) was designed to forecast hourly concentrations of fine particulate matter in the atmosphere. The study is based on three years of pm2.5 time-series data (suspended particulate matter of 2.5 microns aerodynamic diameter), obtained at a downtown station of the MACAM monitoring network of Santiago, Chile, between 1994 and 1996. To obtain the optimal spacing of the data, as well as the number of past values needed to forecast the future value, two standard tests used in the study of dynamical systems were applied: Average Mutual Information (AMI) and False Nearest Neighbors (FNN). It was found most convenient to take as inputs the PM2.5 values sampled every six hours over one day (four values) and to predict the next value from them. Once the number of input variables was fixed and the variable to forecast chosen, a predictive model based on the ANN technique was designed. The network was a multilayer, feed-forward model trained by backpropagation. Networks without a hidden layer and with one and two hidden layers were tested. The best model had one hidden layer, in contrast to earlier work reporting that a network without a hidden layer was more efficient. The results were more accurate than those obtained with a persistence model (which assumes the value six hours ahead will equal the current one).
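
    A sketch of the scheme under stated assumptions: four values spaced six hours apart predict the next value, using a one-hidden-layer network trained by backpropagation on a synthetic daily-cycle series (not the MACAM data), compared against the persistence baseline:

```python
import numpy as np

rng = np.random.default_rng(3)
# Synthetic hourly series with a daily cycle: a stand-in for the PM2.5 record
hours = np.arange(24 * 365)
pm = 50 + 20 * np.sin(2 * np.pi * hours / 24) + 3 * rng.standard_normal(hours.size)

# Inputs: four values spaced 6 h apart over one day; target: the value 6 h later
idx = np.arange(24, pm.size)
X = np.column_stack([pm[idx - 24], pm[idx - 18], pm[idx - 12], pm[idx - 6]])
y = pm[idx]
Xn = (X - X.mean(axis=0)) / X.std(axis=0)   # z-scored inputs
yn = (y - y.mean()) / y.std()               # z-scored target

# One hidden layer, trained by backpropagation (full-batch gradient descent)
W1 = 0.1 * rng.standard_normal((4, 8)); b1 = np.zeros(8)
W2 = 0.1 * rng.standard_normal(8); b2 = 0.0
lr = 0.1
for _ in range(3000):
    h = np.tanh(Xn @ W1 + b1)     # hidden layer
    pred = h @ W2 + b2            # network forecast (training data)
    err = pred - yn
    gW2 = h.T @ err / yn.size
    gb2 = err.mean()
    gh = np.outer(err, W2) * (1 - h ** 2)
    gW1 = Xn.T @ gh / yn.size
    gb1 = gh.mean(axis=0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

mse_net = np.mean((pred - yn) ** 2)               # in-sample, for brevity
persistence = (pm[idx - 6] - y.mean()) / y.std()  # "same as now" baseline
mse_persistence = np.mean((persistence - yn) ** 2)
```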

  12. INTERNAL ENVIRONMENT ANALYSIS TECHNIQUES

    Directory of Open Access Journals (Sweden)

    Caescu Stefan Claudiu

    2011-12-01

    Full Text Available Theme The situation analysis, as a separate component of strategic planning, involves collecting and analysing relevant types of information on the components of the marketing environment and their evolution on the one hand, and on the organization's resources and capabilities on the other. Objectives of the Research The main purpose of studying the analysis techniques of the internal environment is to provide insight into those aspects that are of strategic importance to the organization. Literature Review The marketing environment consists of two distinct components: the internal environment, made up of specific variables within the organization, and the external environment, made up of variables external to the organization. Although analysing the external environment is essential for corporate success, it is not enough unless it is backed by a detailed analysis of the internal environment of the organization. The internal environment includes all elements that are endogenous to the organization, which are influenced to a great extent and totally controlled by it. The study of the internal environment must answer all resource-related questions, solve all resource management issues and represents the first step in drawing up the marketing strategy. Research Methodology The present paper carried out a documentary study of the main techniques used for the analysis of the internal environment. Results The specialist literature emphasizes that differences in performance from one organization to another depend primarily not on differences between fields of activity, but especially on differences between resources and capabilities and the ways these are capitalized on. The main methods of analysing the internal environment addressed in this paper are: the analysis of organizational resources, performance analysis, value chain analysis and functional analysis. Implications Basically such

  13. The Photoplethismographic Signal Processed with Nonlinear Time Series Analysis Tools

    International Nuclear Information System (INIS)

    Hernandez Caceres, Jose Luis; Hong, Rolando; Garcia Lanz, Abel; Garcia Dominguez, Luis; Cabannas, Karelia

    2001-01-01

    Finger photoplethysmography (PPG) signals were submitted to nonlinear time series analysis. The applied analytical techniques were: (i) high-degree polynomial fitting for baseline estimation; (ii) FFT analysis for estimating power spectra; (iii) fractal dimension estimation via Higuchi's time-domain method; and (iv) kernel nonparametric estimation for reconstructing noise-free attractors and for estimating the signal's stochastic components
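
    Technique (iii), Higuchi's time-domain method, admits a compact sketch. The kmax value and the test signals below are illustrative choices, not taken from the paper:

```python
import numpy as np

def higuchi_fd(x, kmax=8):
    """Higuchi's time-domain estimate of a curve's fractal dimension:
    average curve length L(k) at stride k scales as k**(-D)."""
    x = np.asarray(x, dtype=float)
    N = x.size
    ks = np.arange(1, kmax + 1)
    L = []
    for k in ks:
        Lk = []
        for m in range(k):
            idx = np.arange(m, N, k)   # subsampled curve starting at offset m
            n = idx.size - 1           # number of strides
            if n < 1:
                continue
            # Higuchi's normalized curve length for this offset and stride
            Lk.append(np.abs(np.diff(x[idx])).sum() * (N - 1) / (n * k * k))
        L.append(np.mean(Lk))
    slope, _ = np.polyfit(np.log(1.0 / ks), np.log(L), 1)
    return slope

rng = np.random.default_rng(4)
d_noise = higuchi_fd(rng.standard_normal(2000))   # white noise: D close to 2
d_line = higuchi_fd(np.linspace(0, 1, 2000))      # smooth curve: D close to 1
```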

  14. Analysis of series resonant converter with series-parallel connection

    Science.gov (United States)

    Lin, Bor-Ren; Huang, Chien-Lan

    2011-02-01

    In this study, a parallel inductor-inductor-capacitor (LLC) resonant converter series-connected on the primary side and parallel-connected on the secondary side is presented for server power supply systems. Based on series resonant behaviour, the power metal-oxide-semiconductor field-effect transistors are turned on at zero voltage switching and the rectifier diodes are turned off at zero current switching. Thus, the switching losses on the power semiconductors are reduced. In the proposed converter, the primary windings of the two LLC converters are connected in series. Thus, the two converters have the same primary currents to ensure that they can supply the balance load current. On the output side, two LLC converters are connected in parallel to share the load current and to reduce the current stress on the secondary windings and the rectifier diodes. In this article, the principle of operation, steady-state analysis and design considerations of the proposed converter are provided and discussed. Experiments with a laboratory prototype with a 24 V/21 A output for server power supply were performed to verify the effectiveness of the proposed converter.
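
    The resonant behaviour referred to above follows from the tank values. A small sketch with assumed, hypothetical component values (not the prototype's actual design parameters):

```python
import math

# Hypothetical resonant-tank values for one LLC stage (assumed for illustration)
Lr = 60e-6    # series resonant inductance, H
Cr = 24e-9    # series resonant capacitance, F
Lm = 240e-6   # magnetizing inductance, H

# Series resonant frequency: operating near f_r gives zero-voltage turn-on for
# the MOSFETs and zero-current turn-off for the rectifier diodes
f_r = 1 / (2 * math.pi * math.sqrt(Lr * Cr))
# Lower resonant frequency, including the magnetizing inductance
f_m = 1 / (2 * math.pi * math.sqrt((Lr + Lm) * Cr))
```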

  15. Analysis of Heavy-Tailed Time Series

    DEFF Research Database (Denmark)

    Xie, Xiaolei

    This thesis is about analysis of heavy-tailed time series. We discuss tail properties of real-world equity return series and investigate the possibility that a single tail index is shared by all return series of actively traded equities in a market. Conditions for this hypothesis to be true...... are identified. We study the eigenvalues and eigenvectors of sample covariance and sample auto-covariance matrices of multivariate heavy-tailed time series, and particularly for time series with very high dimensions. Asymptotic approximations of the eigenvalues and eigenvectors of such matrices are found...... and expressed in terms of the parameters of the dependence structure, among others. Furthermore, we study an importance sampling method for estimating rare-event probabilities of multivariate heavy-tailed time series generated by matrix recursion. We show that the proposed algorithm is efficient in the sense...
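
    A standard estimator for the tail index discussed above is the Hill estimator. A sketch on a synthetic Pareto sample (the equity return data themselves are not reproduced here):

```python
import numpy as np

def hill_estimator(x, k):
    """Hill estimate of the tail index alpha from the k largest observations."""
    order = np.sort(np.asarray(x, dtype=float))[::-1]   # descending order stats
    logs = np.log(order[:k + 1])
    # Mean log-spacing above the (k+1)-th order statistic estimates 1/alpha
    return 1.0 / np.mean(logs[:k] - logs[k])

rng = np.random.default_rng(5)
alpha = 3.0
# Classical Pareto(alpha) sample with scale 1: a canonical heavy-tailed law
x = rng.pareto(alpha, 100000) + 1.0
alpha_hat = hill_estimator(x, k=2000)
```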

  16. Decision Analysis Technique

    Directory of Open Access Journals (Sweden)

    Hammad Dabo Baba

    2014-01-01

    Full Text Available One of the most significant steps in building-structure maintenance decisions is the physical inspection of the facility to be maintained. The physical inspection involves a cursory assessment of the structure and ratings of the identified defects based on expert evaluation. The objective of this paper is to present a novel approach to prioritizing the criticality of physical defects in a residential building system using a multi-criteria decision analysis approach. A residential building constructed in 1985 was considered in this study. Four criteria are considered in the inspection: Physical Condition of the building system (PC), Effect on Asset (EA), Effect on Occupants (EO) and Maintenance Cost (MC). The building was divided into nine systems regarded as alternatives. Expert Choice software was used to compare the importance of the criteria against the main objective, whereas a structured proforma was used to quantify the defects observed in each building system against each criterion. The defect severity score of each building system was identified and then multiplied by the weight of the criteria, and the final hierarchy was derived. The final ranking indicates that the electrical system was considered the most critical system, with a risk value of 0.134, while the ceiling system scored the lowest risk value of 0.066. The technique is often used in prioritizing mechanical equipment for maintenance planning. However, the results of this study indicate that the technique could also be used in prioritizing building systems for maintenance planning
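
    The weighting-and-ranking step can be sketched as a weighted sum of severity scores. The weights and scores below are hypothetical, not the paper's measured values, though the resulting order mirrors its finding:

```python
# Criteria weights (summing to 1) - assumed values, for illustration only
weights = {"PC": 0.40, "EA": 0.25, "EO": 0.20, "MC": 0.15}

# Defect severity per building system on each criterion (0-1 scale, assumed)
systems = {
    "electrical": {"PC": 0.8, "EA": 0.7, "EO": 0.9, "MC": 0.6},
    "roof":       {"PC": 0.6, "EA": 0.5, "EO": 0.4, "MC": 0.7},
    "ceiling":    {"PC": 0.3, "EA": 0.2, "EO": 0.3, "MC": 0.4},
}

# Risk value of each system: severity scores weighted by criterion importance
priority = {name: sum(weights[c] * s[c] for c in weights)
            for name, s in systems.items()}
ranking = sorted(priority, key=priority.get, reverse=True)
# electrical scores highest and ceiling lowest, mirroring the paper's ranking
```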

  17. Transition Icons for Time-Series Visualization and Exploratory Analysis.

    Science.gov (United States)

    Nickerson, Paul V; Baharloo, Raheleh; Wanigatunga, Amal A; Manini, Todd M; Tighe, Patrick J; Rashidi, Parisa

    2018-03-01

    The modern healthcare landscape has seen the rapid emergence of techniques and devices that temporally monitor and record physiological signals. The prevalence of time-series data within the healthcare field necessitates the development of methods that can analyze the data in order to draw meaningful conclusions. Time-series behavior is notoriously difficult to intuitively understand due to its intrinsic high-dimensionality, which is compounded in the case of analyzing groups of time series collected from different patients. Our framework, which we call transition icons, renders common patterns in a visual format useful for understanding the shared behavior within groups of time series. Transition icons are adept at detecting and displaying subtle differences and similarities, e.g., between measurements taken from patients receiving different treatment strategies or stratified by demographics. We introduce various methods that collectively allow for exploratory analysis of groups of time series, while being free of distribution assumptions and including simple heuristics for parameter determination. Our technique extracts discrete transition patterns from symbolic aggregate approXimation representations, and compiles transition frequencies into a bag of patterns constructed for each group. These transition frequencies are normalized and aligned in icon form to intuitively display the underlying patterns. We demonstrate the transition icon technique for two time-series datasets-postoperative pain scores, and hip-worn accelerometer activity counts. We believe transition icons can be an important tool for researchers approaching time-series data, as they give rich and intuitive information about collective time-series behaviors.
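
    The transition-frequency construction can be sketched as follows. The alphabet size and breakpoints are the usual SAX-style choices, and the random-walk input is illustrative rather than patient data:

```python
import numpy as np

def sax_symbols(x):
    """Discretize a z-normalized series into 4 symbols using standard-normal
    quantile breakpoints (the binning step of symbolic aggregate approximation)."""
    x = np.asarray(x, dtype=float)
    z = (x - x.mean()) / x.std()
    breakpoints = np.array([-0.6745, 0.0, 0.6745])  # quartiles of N(0, 1)
    return np.digitize(z, breakpoints)              # symbols 0..3

def transition_frequencies(symbols, n_symbols=4):
    """Bag of one-step transition patterns, normalized to frequencies."""
    counts = np.zeros((n_symbols, n_symbols))
    for a, b in zip(symbols[:-1], symbols[1:]):
        counts[a, b] += 1
    return counts / counts.sum()

rng = np.random.default_rng(6)
walk = np.cumsum(rng.standard_normal(5000))
freq = transition_frequencies(sax_symbols(walk))
# A slowly drifting series lingers in each symbol: the diagonal dominates,
# and such matrices, aligned per group, are what the icons render visually
```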

  18. Applied time series analysis and innovative computing

    CERN Document Server

    Ao, Sio-Iong

    2010-01-01

    This text is a systematic, state-of-the-art introduction to the use of innovative computing paradigms as an investigative tool for applications in time series analysis. It includes frontier case studies based on recent research.

  19. Time Series Analysis Forecasting and Control

    CERN Document Server

    Box, George E P; Reinsel, Gregory C

    2011-01-01

    A modernized new edition of one of the most trusted books on time series analysis. Since publication of the first edition in 1970, Time Series Analysis has served as one of the most influential and prominent works on the subject. This new edition maintains its balanced presentation of the tools for modeling and analyzing time series and also introduces the latest developments that have occurred n the field over the past decade through applications from areas such as business, finance, and engineering. The Fourth Edition provides a clearly written exploration of the key methods for building, cl

  20. Uncertainty analysis techniques

    International Nuclear Information System (INIS)

    Marivoet, J.; Saltelli, A.; Cadelli, N.

    1987-01-01

    The origin of the uncertainty affecting Performance Assessments, as well as its propagation to dose and risk results, is discussed. The analysis focuses essentially on the uncertainties introduced by the input parameters, the values of which may range over several orders of magnitude and may be given as probability distribution functions. The paper briefly reviews the existing sampling techniques used for Monte Carlo simulations and the methods for characterizing the output curves and determining their convergence and confidence limits. Annual doses, expectation values of the doses and risks are computed for a particular case of a possible repository in clay, in order to illustrate the significance of such output characteristics as the mean, the logarithmic mean and the median, as well as their ratios. The report concludes that, provisionally and owing to its better robustness, an estimate such as the 90th percentile may be substituted for the arithmetic mean when comparing the estimated doses with acceptance criteria. In any case, the results obtained through uncertainty analyses must be interpreted with caution as long as the input data distribution functions are not derived from experiments reasonably reproducing the situation in a well-characterized repository and site
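
    A minimal Monte Carlo sketch of why robust statistics are preferred: with lognormal input distribution functions (assumed here purely for illustration, not the report's actual model), the arithmetic mean of the output is dragged upward by the tail:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 10000
# Hypothetical dose model: two input parameters spanning orders of magnitude,
# each represented by a lognormal probability distribution function (assumed)
release_rate = rng.lognormal(mean=0.0, sigma=1.0, size=n)
transport_factor = rng.lognormal(mean=-2.0, sigma=1.5, size=n)
dose = release_rate * transport_factor   # annual dose, arbitrary units

arithmetic_mean = dose.mean()
median = np.median(dose)
p90 = np.percentile(dose, 90)
# For right-skewed outputs the mean sits far above the median; percentiles
# are the more robust characteristics for comparison with acceptance criteria
```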

  1. Visibility Graph Based Time Series Analysis.

    Science.gov (United States)

    Stephen, Mutua; Gu, Changgui; Yang, Huijie

    2015-01-01

    Network-based time series analysis has made considerable achievements in recent years. By mapping mono/multivariate time series into networks, one can investigate both their microscopic and macroscopic behaviors. However, most proposed approaches lead to the construction of static networks, consequently providing limited information on evolutionary behaviors. In the present paper we propose a method called visibility graph based time series analysis, in which series segments are mapped to visibility graphs as descriptions of the corresponding states and the successively occurring states are linked. This procedure converts a time series into a temporal network and at the same time a network of networks. Findings from empirical records for stock markets in the USA (S&P500 and Nasdaq) and artificial series generated by means of fractional Gaussian motions show that the method can provide rich information benefiting short-term and long-term predictions. Theoretically, we propose a method to investigate time series from the viewpoint of a network of networks.
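
    The natural visibility criterion maps each sample to a node and links two samples when the straight line between their bar tops clears every bar in between. A brute-force sketch (the input series is illustrative):

```python
import numpy as np

def visibility_graph(x):
    """Natural visibility graph: node per sample; samples i and j are linked
    if the line between (i, x[i]) and (j, x[j]) lies above all x[k], i<k<j."""
    x = np.asarray(x, dtype=float)
    n = x.size
    edges = set()
    for i in range(n - 1):
        for j in range(i + 1, n):
            ks = np.arange(i + 1, j)
            line = x[i] + (x[j] - x[i]) * (ks - i) / (j - i)
            if np.all(x[ks] < line):   # vacuously true for adjacent samples
                edges.add((i, j))
    return edges

edges = visibility_graph([3.0, 1.0, 2.0, 5.0, 0.5, 4.0])
# The peak at index 3 "sees" every other sample, so it becomes a hub
```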

  2. Visibility Graph Based Time Series Analysis.

    Directory of Open Access Journals (Sweden)

    Mutua Stephen

    Full Text Available Network-based time series analysis has made considerable achievements in recent years. By mapping mono/multivariate time series into networks, one can investigate both their microscopic and macroscopic behaviors. However, most proposed approaches lead to the construction of static networks, consequently providing limited information on evolutionary behaviors. In the present paper we propose a method called visibility graph based time series analysis, in which series segments are mapped to visibility graphs as descriptions of the corresponding states and the successively occurring states are linked. This procedure converts a time series into a temporal network and at the same time a network of networks. Findings from empirical records for stock markets in the USA (S&P500 and Nasdaq) and artificial series generated by means of fractional Gaussian motions show that the method can provide rich information benefiting short-term and long-term predictions. Theoretically, we propose a method to investigate time series from the viewpoint of a network of networks.

  3. Time series analysis of temporal networks

    Science.gov (United States)

    Sikdar, Sandipan; Ganguly, Niloy; Mukherjee, Animesh

    2016-01-01

    A common but important feature of all real-world networks is that they are temporal in nature, i.e., the network structure changes over time. Due to this dynamic nature, it becomes difficult to propose suitable growth models that can explain the various important characteristic properties of these networks. In fact, in many application-oriented studies only knowing these properties is sufficient. For instance, if one wishes to launch a targeted attack on a network, this can be done even without knowledge of the full network structure; rather, an estimate of some of the properties is sufficient to launch the attack. In this paper we show that even if the network structure at a future time point is not available, one can still estimate its properties. We propose a novel method to map a temporal network to a set of time series instances, analyze them and, using a standard time series forecast model, predict the properties of the temporal network at a later time instance. To this aim, we consider eight properties, such as number of active nodes, average degree and clustering coefficient, and apply our prediction framework to them. We mainly focus on the temporal network of human face-to-face contacts and observe that it represents a stochastic process with memory that can be modeled as an Auto-Regressive Integrated Moving Average (ARIMA) process. We use cross-validation techniques to find the percentage accuracy of our predictions. An important observation is that the frequency-domain properties of the time series, obtained from spectrogram analysis, could be used to refine the prediction framework by identifying beforehand the cases where the error in prediction is likely to be high. This leads to an improvement of 7.96% (for error level ≤20%) in prediction accuracy on average across all datasets. As an application we show how such a prediction scheme can be used to launch targeted attacks on temporal networks. Contribution to the Topical Issue
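
    A minimal sketch of the pipeline: a property series extracted from (synthetic) network snapshots, forecast one step ahead. An AR(1) fitted by least squares stands in here for the full ARIMA machinery used in the paper:

```python
import numpy as np

rng = np.random.default_rng(8)
# Hypothetical property series, e.g. the number of active nodes measured in
# each daily snapshot of a contact network; synthetic AR(1)-like dynamics
prop = np.empty(200)
prop[0] = 100.0
for t in range(1, 200):
    prop[t] = 30 + 0.7 * prop[t - 1] + rng.standard_normal()

# Fit prop[t] = c + phi * prop[t-1] by least squares, then forecast the
# property at the next (unobserved) time instance
y, ylag = prop[1:], prop[:-1]
A = np.column_stack([np.ones_like(ylag), ylag])
(c, phi), *_ = np.linalg.lstsq(A, y, rcond=None)
forecast = c + phi * prop[-1]
```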

  4. Identification of human operator performance models utilizing time series analysis

    Science.gov (United States)

    Holden, F. M.; Shinners, S. M.

    1973-01-01

    The results of an effort performed by Sperry Systems Management Division for AMRL in applying time series analysis as a tool for modeling the human operator are presented. This technique is utilized for determining the variation of the human transfer function under various levels of stress. The human operator's model is determined based on actual input and output data from a tracking experiment.

  5. Allan deviation analysis of financial return series

    Science.gov (United States)

    Hernández-Pérez, R.

    2012-05-01

    We perform a scaling analysis for the return series of different financial assets applying the Allan deviation (ADEV), which is used in time and frequency metrology to characterize quantitatively the stability of frequency standards, since it has proven to be a robust quantity for analyzing fluctuations of non-stationary time series over different observation intervals. The data are daily opening-price series for assets from different markets spanning around ten years. We found that the ADEV results for the return series at short scales resemble those expected for an uncorrelated series, consistent with the efficient market hypothesis. On the other hand, the ADEV results for the absolute return series at short scales (the first one or two decades) decrease approximately following a scaling relation up to a point that differs for almost every asset, after which the ADEV deviates from scaling, which suggests that the presence of clustering, long-range dependence and non-stationarity signatures in the series drives the results for large observation intervals.
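
    The Allan deviation itself is simple to compute from non-overlapping interval averages. A sketch on synthetic uncorrelated "returns" (not the assets studied in the paper):

```python
import numpy as np

def allan_deviation(x, m):
    """Non-overlapping Allan deviation at averaging length m (in samples):
    sqrt(0.5 * mean of squared differences of consecutive interval averages)."""
    x = np.asarray(x, dtype=float)
    n = (x.size // m) * m
    means = x[:n].reshape(-1, m).mean(axis=1)   # average over each interval
    return np.sqrt(0.5 * np.mean(np.diff(means) ** 2))

rng = np.random.default_rng(9)
returns = rng.standard_normal(10000)   # stand-in for an uncorrelated return series
adev1 = allan_deviation(returns, 1)
adev100 = allan_deviation(returns, 100)
# For white noise the ADEV falls as 1/sqrt(m): the uncorrelated-series
# signature consistent with the efficient market hypothesis
```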

  6. Early phase drug discovery: cheminformatics and computational techniques in identifying lead series.

    Science.gov (United States)

    Duffy, Bryan C; Zhu, Lei; Decornez, Hélène; Kitchen, Douglas B

    2012-09-15

    Early drug discovery processes rely on hit finding procedures followed by extensive experimental confirmation in order to select high priority hit series which then undergo further scrutiny in hit-to-lead studies. The experimental cost and the risk associated with poor selection of lead series can be greatly reduced by the use of many different computational and cheminformatic techniques to sort and prioritize compounds. We describe the steps in typical hit identification and hit-to-lead programs and then describe how cheminformatic analysis assists this process. In particular, scaffold analysis, clustering and property calculations assist in the design of high-throughput screening libraries, the early analysis of hits and then organizing compounds into series for their progression from hits to leads. Additionally, these computational tools can be used in virtual screening to design hit-finding libraries and as procedures to help with early SAR exploration. Copyright © 2012 Elsevier Ltd. All rights reserved.
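
    Grouping hits into series typically rests on fingerprint similarity. A toy sketch using the Tanimoto coefficient on hypothetical bit sets and a greedy grouping rule (no real fingerprinting package or actual compound data is assumed):

```python
def tanimoto(a, b):
    """Tanimoto coefficient between two fingerprint bit sets."""
    return len(a & b) / len(a | b)

# Toy fingerprints: each compound is a set of "on" bits (hypothetical values)
fps = {
    "cmpd1": {1, 2, 3, 4, 5},
    "cmpd2": {1, 2, 3, 4, 6},    # close analogue of cmpd1: same series
    "cmpd3": {10, 11, 12, 13},   # different scaffold: its own series
}

def cluster(fps, threshold=0.6):
    """Greedy clustering: attach each compound to the first cluster whose
    seed compound it matches above the similarity threshold."""
    clusters = []
    for name, fp in fps.items():
        for c in clusters:
            if tanimoto(fp, fps[c[0]]) >= threshold:
                c.append(name)
                break
        else:
            clusters.append([name])
    return clusters

groups = cluster(fps)
```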

  7. Entropic Analysis of Electromyography Time Series

    Science.gov (United States)

    Kaufman, Miron; Sung, Paul

    2005-03-01

    We are in the process of assessing the effectiveness of fractal and entropic measures for the diagnosis of low back pain from surface electromyography (EMG) time series. Surface EMG is used to assess patients with low back pain. In a typical EMG measurement, the voltage is measured every millisecond. We observed back-muscle fatigue over one minute, which results in a time series with 60,000 entries. We characterize the complexity of the time series by computing the time dependence of the Shannon entropy. The analysis of time series from different relevant muscles of healthy and low back pain (LBP) individuals provides evidence that the level of variability of back muscle activities is much larger for healthy individuals than for individuals with LBP. In general, the time dependence of the entropy shows a crossover from a diffusive regime to a regime characterized by long-time correlations (self-organization) at about 0.01 s.
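
    The entropy time dependence can be sketched as windowed Shannon entropy over fixed amplitude bins. The two signals below are synthetic stand-ins for high- and low-variability muscle activity, not patient data:

```python
import numpy as np

def shannon_entropy(x, edges):
    """Shannon entropy (bits) of the amplitude histogram over fixed bin edges."""
    p, _ = np.histogram(x, bins=edges)
    p = p[p > 0] / p.sum()
    return -np.sum(p * np.log2(p))

rng = np.random.default_rng(10)
n = 60_000                               # one minute at 1 kHz: 60,000 entries
high_var = rng.standard_normal(n)        # stand-in for variable (healthy) EMG
low_var = 0.1 * rng.standard_normal(n)   # stand-in for reduced (LBP) variability
edges = np.linspace(-4, 4, 17)           # common amplitude bins for both signals

def entropy_course(x, win=1000):
    """Entropy time dependence: entropy in consecutive one-second windows."""
    return np.array([shannon_entropy(x[i:i + win], edges)
                     for i in range(0, x.size - win + 1, win)])

e_high = entropy_course(high_var).mean()
e_low = entropy_course(low_var).mean()
# The more variable signal spreads over more amplitude bins: higher entropy
```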

  8. Introduction to time series analysis and forecasting

    CERN Document Server

    Montgomery, Douglas C; Kulahci, Murat

    2015-01-01

    Praise for the First Edition ""…[t]he book is great for readers who need to apply the methods and models presented but have little background in mathematics and statistics."" -MAA Reviews Thoroughly updated throughout, Introduction to Time Series Analysis and Forecasting, Second Edition presents the underlying theories of time series analysis that are needed to analyze time-oriented data and construct real-world short- to medium-term statistical forecasts.    Authored by highly-experienced academics and professionals in engineering statistics, the Second Edition features discussions on both

  9. Handbook of Time Series Analysis Recent Theoretical Developments and Applications

    CERN Document Server

    Schelter, Björn; Timmer, Jens

    2006-01-01

    This handbook provides an up-to-date survey of current research topics and applications of time series analysis methods written by leading experts in their fields. It covers recent developments in univariate as well as bivariate and multivariate time series analysis techniques ranging from physics' to life sciences' applications. Each chapter comprises both methodological aspects and applications to real world complex systems, such as the human brain or Earth's climate. Covering an exceptionally broad spectrum of topics, beginners, experts and practitioners who seek to understand the latest de

  10. Analysis and analytical techniques

    Energy Technology Data Exchange (ETDEWEB)

    Batuecas Rodriguez, T [Department of Chemistry and Isotopes, Junta de Energia Nuclear, Madrid (Spain)

    1967-01-01

    The technology associated with the use of organic coolants in nuclear reactors depends to a large extent on the determination and control of their physical and chemical properties, and particularly on the viability, speed, sensitivity, precision and accuracy (depending on the intended usage) of the methods employed in detection and analytical determination. This has led to the study and development of numerous techniques, some specially designed for the extreme conditions involved in working with the types of product in question and others adapted from existing techniques. In the specific case of polyphenyl and hydropolyphenyl mixtures, which have been the principal subjects of study to date and offer greatest promise, the analytical problems are broadly as follows: composition of the initial product or virgin coolant (composition of macro components and amounts of organic and inorganic impurities); the coolant during and after operation (determination of gases and organic compounds produced by pyrolysis and radiolysis, i.e. degradation and polymerization products); control of systems for purifying and regenerating the coolant after use; dissolved pressurization gases; detection of intermediate products during decomposition, which are generally very unstable (free radicals); degree of fouling and film formation (tests to determine potential formation of films); corrosion of structural elements and canning materials; and health and safety (toxicity, inflammability and impurities that can be activated). Although some of the above problems are closely interrelated and entail similar techniques, they vary as to degree of difficulty. Another question is the difficulty of distinguishing clearly between techniques for determining physical and physico-chemical properties, on the one hand, and analytical techniques on the other. Any classification is therefore somewhat arbitrary (for example, in the case of dosimetry and techniques for determining mean molecular weights or electrical conductivity

  11. Particle Markov Chain Monte Carlo Techniques of Unobserved Component Time Series Models Using Ox

    DEFF Research Database (Denmark)

    Nonejad, Nima

    This paper details Particle Markov chain Monte Carlo (PMCMC) techniques for the analysis of unobserved component time series models using several economic data sets. PMCMC combines the particle filter with the Metropolis-Hastings algorithm. Overall, PMCMC provides a very compelling, computationally fast and efficient framework for estimation. These advantages are used to estimate, for instance, stochastic volatility models with leverage effects or with Student-t distributed errors. We also model changing time series characteristics of the US inflation rate by considering a heteroskedastic ARFIMA model where...
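The combination the record describes, a particle filter nested inside a Metropolis-Hastings sampler (the pseudo-marginal construction), can be sketched on a toy local-level model. The model, parameter values, proposal scale, and particle count below are illustrative assumptions, not the paper's setup.

```python
import math, random

random.seed(0)

# Toy local-level (unobserved component) model -- illustrative only:
#   x_t = x_{t-1} + w_t,  w_t ~ N(0, q)     (hidden state)
#   y_t = x_t + v_t,      v_t ~ N(0, R)     (observation)
Q_TRUE, R = 0.5, 1.0
x, ys = 0.0, []
for _ in range(120):
    x += random.gauss(0.0, math.sqrt(Q_TRUE))
    ys.append(x + random.gauss(0.0, math.sqrt(R)))

def pf_loglik(q, ys, n=200):
    """Bootstrap particle filter estimate of log p(y_{1:T} | q)."""
    parts = [0.0] * n
    ll = 0.0
    for y in ys:
        # propagate particles through the random-walk state equation
        parts = [p + random.gauss(0.0, math.sqrt(q)) for p in parts]
        # weight by the Gaussian observation density
        ws = [math.exp(-0.5 * (y - p) ** 2 / R) / math.sqrt(2 * math.pi * R)
              for p in parts]
        ll += math.log(sum(ws) / n + 1e-300)
        parts = random.choices(parts, weights=ws, k=n)   # multinomial resampling
    return ll

# Metropolis-Hastings random walk on log q, driven by the noisy PF likelihood
logq, ll = 0.0, pf_loglik(1.0, ys)
draws = []
for _ in range(120):
    prop = logq + random.gauss(0.0, 0.3)
    ll_prop = pf_loglik(math.exp(prop), ys)
    if math.log(random.random()) < ll_prop - ll:   # flat prior on log q
        logq, ll = prop, ll_prop
    draws.append(math.exp(logq))

q_est = sum(draws[40:]) / len(draws[40:])   # posterior mean after burn-in
```

The key point of the pseudo-marginal scheme is that the noisy particle-filter likelihood estimate can be used in the Metropolis-Hastings ratio as if it were exact, provided the stored estimate for the current point is reused rather than recomputed.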

  12. Nonlinear Time Series Analysis via Neural Networks

    Science.gov (United States)

    Volná, Eva; Janošek, Michal; Kocian, Václav; Kotyrba, Martin

    This article deals with time series analysis based on neural networks, in order to achieve effective forex market [Moore and Roche, J. Int. Econ. 58, 387-411 (2002)] pattern recognition. Our goal is to find and recognize important patterns which repeatedly appear in the market history, and to adapt our trading system's behaviour based on them.

  13. Lecture notes for Advanced Time Series Analysis

    DEFF Research Database (Denmark)

    Madsen, Henrik; Holst, Jan

    1997-01-01

    A first version of these notes was used for the lectures in Grenoble; they have since been extended and improved (together with Jan Holst) and used in Ph.D. courses on Advanced Time Series Analysis at IMM and at the Department of Mathematical Statistics, University of Lund, 1994, 1997, ...

  14. A taylor series approach to survival analysis

    International Nuclear Information System (INIS)

    Brodsky, J.B.; Groer, P.G.

    1984-09-01

    A method of survival analysis using hazard functions is developed. The method uses the well known mathematical theory for Taylor Series. Hypothesis tests of the adequacy of many statistical models, including proportional hazards and linear and/or quadratic dose responses, are obtained. A partial analysis of leukemia mortality in the Life Span Study cohort is used as an example. Furthermore, a relatively robust estimation procedure for the proportional hazards model is proposed. (author)
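The Taylor-series idea behind this approach can be illustrated with a toy hazard whose logarithm is truncated to first order; the coefficients and the closed-form/numeric comparison below are hypothetical, not taken from the paper.

```python
import math

# Hypothetical log-hazard expanded as a truncated Taylor series:
#   log h(t) ~= b0 + b1*t   (illustrative coefficients, not fitted values)
b0, b1 = -2.0, 0.3

def hazard(t):
    return math.exp(b0 + b1 * t)

def survival(t):
    # S(t) = exp(-H(t)), where H(t) = integral of h(u) du on [0, t];
    # for the first-order expansion H has a closed form
    H = math.exp(b0) * (math.exp(b1 * t) - 1.0) / b1
    return math.exp(-H)

def survival_numeric(t, steps=10000):
    # midpoint-rule check of the closed-form cumulative hazard
    dt = t / steps
    H = sum(hazard((i + 0.5) * dt) for i in range(steps)) * dt
    return math.exp(-H)
```

Adding a quadratic term `b2*t**2` to the expansion gives the quadratic dose-response form mentioned in the abstract; testing whether `b2 = 0` is then an ordinary hypothesis test on the coefficient.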

  15. Time series analysis of barometric pressure data

    International Nuclear Information System (INIS)

    La Rocca, Paola; Riggi, Francesco; Riggi, Daniele

    2010-01-01

    Time series of atmospheric pressure data, collected over a period of several years, were analysed to provide undergraduate students with educational examples of application of simple statistical methods of analysis. In addition to basic methods for the analysis of periodicities, a comparison of two forecast models, one based on autoregression algorithms, and the other making use of an artificial neural network, was made. Results show that the application of artificial neural networks may give slightly better results compared to traditional methods.
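One of the two forecast models compared, autoregression, can be sketched on synthetic pressure-like data. The AR order, the 24-step cycle, and the noise level are illustrative assumptions, not the paper's data.

```python
import math, random

random.seed(1)

# Synthetic "barometric pressure" series: a daily cycle plus noise
series = [1013.0 + 5.0 * math.sin(2 * math.pi * t / 24) + random.gauss(0, 0.5)
          for t in range(500)]

def fit_ar(data, p):
    """Least-squares fit of an AR(p) model via the normal equations."""
    rows = [data[t - p:t][::-1] for t in range(p, len(data))]
    y = data[p:]
    xtx = [[sum(r[i] * r[j] for r in rows) for j in range(p)] for i in range(p)]
    xty = [sum(r[i] * yi for r, yi in zip(rows, y)) for i in range(p)]
    for i in range(p):                       # Gaussian elimination
        for k in range(i + 1, p):
            f = xtx[k][i] / xtx[i][i]
            xtx[k] = [a - f * b for a, b in zip(xtx[k], xtx[i])]
            xty[k] -= f * xty[i]
    coef = [0.0] * p
    for i in reversed(range(p)):             # back substitution
        coef[i] = (xty[i] - sum(xtx[i][j] * coef[j]
                                for j in range(i + 1, p))) / xtx[i][i]
    return coef

P = 26
mean = sum(series) / len(series)
data = [v - mean for v in series]            # demean before fitting
coef = fit_ar(data[:-24], P)

hist, preds = data[:-24], []
for _ in range(24):                          # 24-step recursive forecast
    nxt = sum(c * v for c, v in zip(coef, hist[-P:][::-1]))
    preds.append(nxt + mean)
    hist = hist + [nxt]

rmse = math.sqrt(sum((p - a) ** 2
                     for p, a in zip(preds, series[-24:])) / 24)
```

A neural-network forecaster would replace `fit_ar` with a trained regression on the same lagged inputs; the paper's finding is that the two perform comparably, with the network slightly ahead.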

  16. The Statistical Analysis of Time Series

    CERN Document Server

    Anderson, T W

    2011-01-01

    The Wiley Classics Library consists of selected books that have become recognized classics in their respective fields. With these new unabridged and inexpensive editions, Wiley hopes to extend the life of these important works by making them available to future generations of mathematicians and scientists. Currently available in the Series: T. W. Anderson Statistical Analysis of Time Series T. S. Arthanari & Yadolah Dodge Mathematical Programming in Statistics Emil Artin Geometric Algebra Norman T. J. Bailey The Elements of Stochastic Processes with Applications to the Natural Sciences George

  17. Innovative techniques to analyze time series of geomagnetic activity indices

    Science.gov (United States)

    Balasis, Georgios; Papadimitriou, Constantinos; Daglis, Ioannis A.; Potirakis, Stelios M.; Eftaxias, Konstantinos

    2016-04-01

    Magnetic storms are undoubtedly among the most important phenomena in space physics and also a central subject of space weather. The non-extensive Tsallis entropy has recently been introduced as an effective complexity measure for the analysis of the geomagnetic activity Dst index. The Tsallis entropy sensitively shows the complexity dissimilarity among different "physiological" (normal) and "pathological" states (intense magnetic storms). More precisely, the Tsallis entropy implies the emergence of two distinct patterns: (i) a pattern associated with the intense magnetic storms, which is characterized by a higher degree of organization, and (ii) a pattern associated with normal periods, which is characterized by a lower degree of organization. Other entropy measures such as Block Entropy, T-Complexity, Approximate Entropy, Sample Entropy and Fuzzy Entropy verify this result. Importantly, the wavelet spectral analysis in terms of the Hurst exponent, H, also shows the existence of two different patterns: (i) a pattern associated with the intense magnetic storms, which is characterized by fractional Brownian persistent behavior, and (ii) a pattern associated with normal periods, which is characterized by fractional Brownian anti-persistent behavior. Finally, we observe universality in the magnetic storm and earthquake dynamics, on the basis of a modified form of the Gutenberg-Richter law for the Tsallis statistics. This finding suggests a common approach to the interpretation of both phenomena in terms of the same driving physical mechanism. Signatures of discrete scale invariance in Dst time series further support this proposal.
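The Tsallis entropy used here has a simple closed form, S_q = (1 - Σᵢ pᵢ^q)/(q - 1), computed over the probabilities of symbols obtained by coarse-graining the index. A minimal sketch, with an arbitrary entropic index and invented symbol sequences standing in for storm-time and quiet-time statistics:

```python
import math
from collections import Counter

def tsallis_entropy(symbols, q=1.8):
    """Non-extensive (Tsallis) entropy of a symbol sequence,
    S_q = (1 - sum_i p_i^q) / (q - 1); q -> 1 recovers Shannon entropy.
    The index q = 1.8 is an arbitrary illustrative choice."""
    n = len(symbols)
    probs = [c / n for c in Counter(symbols).values()]
    if q == 1.0:
        return -sum(p * math.log(p) for p in probs)
    return (1.0 - sum(p ** q for p in probs)) / (q - 1.0)

# A more "organized" sequence (one symbol dominates, storm-like) should score
# lower than a balanced, disordered one (quiet-time-like).
quiet = list('01' * 200)            # balanced symbol statistics
storm = ['0'] * 360 + ['1'] * 40    # skewed, more organized statistics
```

With these toy sequences, `tsallis_entropy(storm)` comes out below `tsallis_entropy(quiet)`, mirroring the storm/quiet organization contrast the abstract reports.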

  18. Fourier analysis of time series an introduction

    CERN Document Server

    Bloomfield, Peter

    2000-01-01

    A new, revised edition of a yet unrivaled work on frequency domain analysis Long recognized for his unique focus on frequency domain methods for the analysis of time series data as well as for his applied, easy-to-understand approach, Peter Bloomfield brings his well-known 1976 work thoroughly up to date. With a minimum of mathematics and an engaging, highly rewarding style, Bloomfield provides in-depth discussions of harmonic regression, harmonic analysis, complex demodulation, and spectrum analysis. All methods are clearly illustrated using examples of specific data sets, while ample
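The spectrum analysis at the core of Bloomfield's book starts from the periodogram at the Fourier frequencies. A minimal sketch on a synthetic series with a known cycle (the period, amplitude and noise level are invented for illustration):

```python
import math, random

random.seed(2)
N = 240
# toy series with a known cycle of period 12 plus noise
x = [3.0 * math.cos(2 * math.pi * t / 12 + 0.4) + random.gauss(0, 1)
     for t in range(N)]
m = sum(x) / N
x = [v - m for v in x]                      # remove the mean first

def periodogram(x):
    """I(f_k) = |sum_t x_t e^{-2*pi*i*k*t/N}|^2 / N at Fourier frequencies."""
    N = len(x)
    I = []
    for k in range(1, N // 2):
        re = sum(v * math.cos(2 * math.pi * k * t / N) for t, v in enumerate(x))
        im = sum(v * math.sin(2 * math.pi * k * t / N) for t, v in enumerate(x))
        I.append((re * re + im * im) / N)
    return I

I = periodogram(x)
k_peak = I.index(max(I)) + 1                # dominant Fourier frequency index
period = N / k_peak
```

Harmonic regression then fits sine and cosine terms at the detected frequency by least squares; complex demodulation extracts how that component's amplitude and phase drift over time.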

  19. Nonlinear time series analysis with R

    CERN Document Server

    Huffaker, Ray; Rosa, Rodolfo

    2017-01-01

    In the process of data analysis, the investigator often faces highly volatile and random-appearing observed data. A vast body of literature shows that the assumption of underlying stochastic processes did not necessarily represent the nature of the processes under investigation and, when other tools were used, deterministic features emerged. Nonlinear Time Series Analysis (NLTS) allows researchers to test whether observed volatility conceals systematic nonlinear behavior, and to rigorously characterize governing dynamics. Behavioral patterns detected by nonlinear time series analysis, along with scientific principles and other expert information, guide the specification of mechanistic models that serve to explain real-world behavior rather than merely reproducing it. Often there is a misconception regarding the complexity of the mathematics needed to understand and utilize the tools of NLTS (for instance, chaos theory). However, the mathematics used in NLTS is much simpler than many other subjec...

  20. A neuro-fuzzy computing technique for modeling hydrological time series

    Science.gov (United States)

    Nayak, P. C.; Sudheer, K. P.; Rangan, D. M.; Ramasastri, K. S.

    2004-05-01

    Intelligent computing tools such as artificial neural networks (ANN) and fuzzy logic approaches have proven to be efficient when applied individually to a variety of problems. Recently there has been a growing interest in combining the two approaches, and as a result neuro-fuzzy computing techniques have evolved. This approach has been tested and evaluated in the field of signal processing and related areas, but researchers have only begun evaluating the potential of this neuro-fuzzy hybrid approach in hydrologic modeling studies. This paper presents the application of an adaptive neuro-fuzzy inference system (ANFIS) to hydrologic time series modeling, illustrated by an application to model the river flow of the Baitarani River in Orissa state, India. An introduction to the ANFIS modeling approach is also presented. The advantage of the method is that it does not require the model structure to be known a priori, in contrast to most time series modeling techniques. The results showed that the ANFIS-forecasted flow series preserves the statistical properties of the original flow series. The model showed good performance in terms of various statistical indices. The results are highly promising, and a comparative analysis suggests that the proposed modeling approach outperforms ANNs and other traditional time series models in terms of computational speed, forecast errors, efficiency, peak flow estimation, etc. It was observed that the ANFIS model fully preserves the potential of the ANN approach and eases the model-building process.

  1. Multivariate analysis techniques

    Energy Technology Data Exchange (ETDEWEB)

    Bendavid, Josh [European Organization for Nuclear Research (CERN), Geneva (Switzerland); Fisher, Wade C. [Michigan State Univ., East Lansing, MI (United States); Junk, Thomas R. [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States)

    2016-01-01

    The end products of experimental data analysis are designed to be simple and easy to understand: hypothesis tests and measurements of parameters. But, the experimental data themselves are voluminous and complex. Furthermore, in modern collider experiments, many petabytes of data must be processed in search of rare new processes which occur together with much more copious background processes that are of less interest to the task at hand. The systematic uncertainties on the background may be larger than the expected signal in many cases. The statistical power of an analysis and its sensitivity to systematic uncertainty can therefore usually both be improved by separating signal events from background events with higher efficiency and purity.

  2. Soil analysis. Modern instrumental technique

    International Nuclear Information System (INIS)

    Smith, K.A.

    1993-01-01

    This book covers traditional methods of analysis and specialist monographs on individual instrumental techniques, which are usually not written with soil or plant analysis specifically in mind. The principles of the techniques are combined with discussions of sample preparation and matrix problems, and critical reviews of applications in soil science and related disciplines. Individual chapters are processed separately for inclusion in the appropriate data bases

  3. An operator expansion technique for path integral analysis

    International Nuclear Information System (INIS)

    Tsvetkov, I.V.

    1995-01-01

    A new method of path integral analysis in the framework of a power series technique is presented. The method is based on the operator expansion of an exponential. A regular procedure to calculate the correction terms is found. (orig.)

  4. Surface analysis the principal techniques

    CERN Document Server

    Vickerman, John C

    2009-01-01

    This completely updated and revised second edition of Surface Analysis: The Principal Techniques, deals with the characterisation and understanding of the outer layers of substrates, how they react, look and function which are all of interest to surface scientists. Within this comprehensive text, experts in each analysis area introduce the theory and practice of the principal techniques that have shown themselves to be effective in both basic research and in applied surface analysis. Examples of analysis are provided to facilitate the understanding of this topic and to show readers how they c

  5. Chaotic time series analysis in economics: Balance and perspectives

    International Nuclear Information System (INIS)

    Faggini, Marisa

    2014-01-01

    The aim of the paper is not to review the large body of work concerning nonlinear time series analysis in economics, about which much has been written, but rather to focus on the new techniques developed to detect chaotic behaviours in economic data. More specifically, our attention will be devoted to reviewing some of these techniques and their application to economic and financial data in order to understand why chaos theory, after a period of growing interest, appears now not to be such an interesting and promising research area

  6. Chaotic time series analysis in economics: Balance and perspectives

    Energy Technology Data Exchange (ETDEWEB)

    Faggini, Marisa, E-mail: mfaggini@unisa.it [Dipartimento di Scienze Economiche e Statistiche, Università di Salerno, Fisciano 84084 (Italy)

    2014-12-15

    The aim of the paper is not to review the large body of work concerning nonlinear time series analysis in economics, about which much has been written, but rather to focus on the new techniques developed to detect chaotic behaviours in economic data. More specifically, our attention will be devoted to reviewing some of these techniques and their application to economic and financial data in order to understand why chaos theory, after a period of growing interest, appears now not to be such an interesting and promising research area.

  7. Gold analysis by the gamma absorption technique

    International Nuclear Information System (INIS)

    Kurtoglu, Arzu; Tugrul, A.B.

    2003-01-01

    Gold (Au) analyses are generally performed using destructive techniques. In this study, the Gamma Absorption Technique has been employed for gold analysis. A series of different gold alloys of known gold content were analysed and a calibration curve was obtained. This curve was then used for the analysis of unknown samples. Gold analyses can be made non-destructively, easily and quickly by the gamma absorption technique. The mass attenuation coefficients of the alloys were measured around the K-shell absorption edge of Au. Theoretical mass attenuation coefficient values were obtained using the WinXCom program and comparison of the experimental results with the theoretical values showed generally good and acceptable agreement
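The calibration-curve idea reduces to the Beer-Lambert law with a mixture rule for the alloy's mass attenuation coefficient. A sketch with placeholder coefficients (the numbers below are illustrative, not the paper's measured values):

```python
import math

# Illustrative (not measured) mass attenuation coefficients, cm^2/g, for a
# photon energy near the Au K-edge, and an assumed areal density of the sample
MU = {'Au': 4.5, 'Cu': 0.9}
AREAL_DENSITY = 1.2   # g/cm^2 in the beam path

def transmission(w_au):
    """Beer-Lambert transmission I/I0 for an Au-Cu alloy; the mixture rule
    gives mu = sum_i w_i * mu_i over the mass fractions w_i."""
    mu_mix = w_au * MU['Au'] + (1.0 - w_au) * MU['Cu']
    return math.exp(-mu_mix * AREAL_DENSITY)

def gold_fraction(t):
    """Invert the calibration curve: gold mass fraction from a measured
    transmission I/I0."""
    mu_mix = -math.log(t) / AREAL_DENSITY
    return (mu_mix - MU['Cu']) / (MU['Au'] - MU['Cu'])
```

Because the attenuation contrast is largest just across the Au K-shell absorption edge, measuring near the edge (as the study does) maximizes the slope of this calibration curve and hence the sensitivity.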

  8. Discontinuous conduction mode analysis of phase-modulated series ...

    Indian Academy of Sciences (India)

    modulated dc–dc series resonant converter (SRC) operating in discontinuous conduction mode (DCM). The conventional fundamental harmonic approximation technique is extended for a non-ideal series resonant tank to clarify the limitations of ...

  9. Time series analysis methods and applications for flight data

    CERN Document Server

    Zhang, Jianye

    2017-01-01

    This book focuses on different facets of flight data analysis, including the basic goals, methods, and implementation techniques. As mass flight data possesses the typical characteristics of time series, the time series analysis methods and their application for flight data have been illustrated from several aspects, such as data filtering, data extension, feature optimization, similarity search, trend monitoring, fault diagnosis, and parameter prediction, etc. An intelligent information-processing platform for flight data has been established to assist in aircraft condition monitoring, training evaluation and scientific maintenance. The book will serve as a reference resource for people working in aviation management and maintenance, as well as researchers and engineers in the fields of data analysis and data mining.

  10. The Timeseries Toolbox - A Web Application to Enable Accessible, Reproducible Time Series Analysis

    Science.gov (United States)

    Veatch, W.; Friedman, D.; Baker, B.; Mueller, C.

    2017-12-01

    The vast majority of data analyzed by climate researchers are repeated observations of physical processes, i.e. time series data. This data lends itself to a common set of statistical techniques and models designed to determine trends and variability (e.g., seasonality) of these repeated observations. Often, these same techniques and models can be applied to a wide variety of different time series data. The Timeseries Toolbox is a web application designed to standardize and streamline these common approaches to time series analysis and modeling, with particular attention to hydrologic time series used in climate preparedness and resilience planning and design by the U.S. Army Corps of Engineers. The application performs much of the pre-processing of time series data necessary for more complex techniques (e.g. interpolation, aggregation). With this tool, users can upload any dataset that conforms to a standard template and immediately begin applying these techniques to analyze their time series data.
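The two pre-processing steps named (interpolation and aggregation) can be sketched in plain Python; the function names and the linear-interpolation choice are assumptions for illustration, not the Toolbox's actual API.

```python
def interpolate(times, values, grid):
    """Linear interpolation of an irregularly sampled series onto a regular
    grid; grid points are assumed to lie within [times[0], times[-1]]."""
    out, i = [], 0
    for t in grid:
        while i + 1 < len(times) and times[i + 1] < t:
            i += 1
        t0, t1 = times[i], times[i + 1]
        v0, v1 = values[i], values[i + 1]
        out.append(v0 + (v1 - v0) * (t - t0) / (t1 - t0))
    return out

def aggregate(values, width):
    """Non-overlapping block means, e.g. daily values from hourly ones."""
    return [sum(values[i:i + width]) / width
            for i in range(0, len(values) - width + 1, width)]
```

For example, `interpolate([0, 2, 4], [0.0, 2.0, 8.0], [1, 3])` fills the gaps at t=1 and t=3, and `aggregate(hourly, 24)` would collapse hourly values to daily means.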

  11. Biostatistics series module 9: Survival analysis

    Directory of Open Access Journals (Sweden)

    Avijit Hazra

    2017-01-01

    Survival analysis is concerned with "time to event" data. Conventionally, it dealt with cancer death as the event in question, but it can handle any event occurring over a time frame, and this need not always be adverse in nature. When the outcome of a study is the time to an event, it is often not possible to wait until the event in question has happened to all the subjects, for example, until all are dead. In addition, subjects may leave the study prematurely. Such situations lead to what are called censored observations, as complete information is not available for these subjects. The data set is thus an assemblage of times to the event in question and times after which no more information on the individual is available. Survival analysis methods are the only techniques capable of handling censored observations without treating them as missing data. They also make no assumption regarding the normal distribution of time to event data. Descriptive methods for exploring survival times in a sample include life table and Kaplan–Meier techniques, as well as various kinds of distribution fitting as advanced modeling techniques. The Kaplan–Meier cumulative survival probability over time plot has become the signature plot for biomedical survival analysis. Several techniques are available for comparing the survival experience in two or more groups – the log-rank test is popularly used. This test can also be used to produce an odds ratio as an estimate of risk of the event in the test group; this is called the hazard ratio (HR). Limitations of the traditional log-rank test have led to various modifications and enhancements. Finally, survival analysis offers different regression models for estimating the impact of multiple predictors on survival. Cox's proportional hazard model is the most general of the regression methods that allows the hazard function to be modeled on a set of explanatory variables without making restrictive assumptions concerning the
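The Kaplan–Meier estimator described above follows directly from its product-limit definition: at each observed event time, multiply the running survival probability by (1 − d/n), where d is the number of events and n the number still at risk. A minimal sketch with invented toy data:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier product-limit survival estimate.  events[i] is 1 if the
    event was observed at times[i], 0 if the observation was censored."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    s, curve, i = 1.0, [], 0
    while i < len(data):
        t = data[i][0]
        d = at_this_time = 0
        while i < len(data) and data[i][0] == t:   # group ties at time t
            d += data[i][1]
            at_this_time += 1
            i += 1
        if d:                                      # step down only at events
            s *= 1.0 - d / n_at_risk
            curve.append((t, s))
        n_at_risk -= at_this_time                  # censored leave the risk set
    return curve

# toy data: times to event, with two censored observations (events = 0)
times  = [2, 3, 3, 5, 7, 8]
events = [1, 1, 0, 1, 0, 1]
curve = kaplan_meier(times, events)
```

Note how the censored subjects at t=3 and t=7 contribute to the risk set up to their censoring times but never force a step down, which is exactly the "not treated as missing" property the abstract emphasizes.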

  12. Bulk analysis using nuclear techniques

    International Nuclear Information System (INIS)

    Borsaru, M.; Holmes, R.J.; Mathew, P.J.

    1983-01-01

    Bulk analysis techniques developed for the mining industry are reviewed. Using penetrating neutron and gamma radiations, measurements are obtained directly from a large volume of sample (3-30 kg). Gamma techniques were used to determine the grade of iron ore and to detect shale on conveyor belts. Thermal neutron irradiation was developed for the simultaneous determination of iron and aluminium in iron ore on a conveyor belt. Thermal-neutron activation analysis includes the determination of alumina in bauxite, and manganese and alumina in manganese ore. Fast neutron activation analysis is used to determine silicon in iron ores, and alumina and silica in bauxite. Fast and thermal neutron activation has been used to determine the soil in shredded sugar cane. (U.K.)

  13. Notes on economic time series analysis system theoretic perspectives

    CERN Document Server

    Aoki, Masanao

    1983-01-01

    In seminars and graduate level courses I have had several opportunities to discuss modeling and analysis of time series with economists and economic graduate students during the past several years. These experiences made me aware of a gap between what economic graduate students are taught about vector-valued time series and what is available in recent system literature. Wishing to fill or narrow the gap that I suspect is more widely spread than my personal experiences indicate, I have written these notes to augment and reorganize materials I have given in these courses and seminars. I have endeavored to present, in as much a self-contained way as practicable, a body of results and techniques in system theory that I judge to be relevant and useful to economists interested in using time series in their research. I have essentially acted as an intermediary and interpreter of system theoretic results and perspectives in time series by filtering out non-essential details, and presenting coherent accounts of wha...

  14. Assessing Spontaneous Combustion Instability with Nonlinear Time Series Analysis

    Science.gov (United States)

    Eberhart, C. J.; Casiano, M. J.

    2015-01-01

    Considerable interest lies in the ability to characterize the onset of spontaneous instabilities within liquid propellant rocket engine (LPRE) combustion devices. Linear techniques, such as fast Fourier transforms, various correlation parameters, and critical damping parameters, have been used at great length for over fifty years. Recently, nonlinear time series methods have been applied to deduce information pertaining to instability incipiency hidden in seemingly stochastic combustion noise. A technique commonly used in the biological sciences, known as Multifractal Detrended Fluctuation Analysis, has been extended to the combustion dynamics field, and is introduced here as a data analysis approach complementary to linear ones. Building on this, a modified technique is leveraged to extract artifacts of impending combustion instability that present themselves prior to growth to limit-cycle amplitudes. Analysis is demonstrated on data from J-2X gas generator testing during which a distinct spontaneous instability was observed. Comparisons are made to previous work wherein the data were characterized using linear approaches. Verification of the technique is performed by examining idealized signals and comparing two separate, independently developed tools.
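The multifractal method builds on ordinary detrended fluctuation analysis (DFA): integrate the series, detrend it window by window, and read a scaling exponent off the window-size dependence of the RMS fluctuation. The monofractal sketch below (window sizes and the white-noise test signal are illustrative) shows the core step; MFDFA generalizes it with q-th-order moments.

```python
import math, random

random.seed(3)

def dfa_fluctuation(x, win):
    """RMS fluctuation of the integrated series about per-window linear
    trends, for a single window size (the core step of DFA)."""
    mean = sum(x) / len(x)
    y, s = [], 0.0
    for v in x:
        s += v - mean
        y.append(s)                          # integrated (profile) series
    sq, nwin = 0.0, len(y) // win
    for w in range(nwin):
        seg = y[w * win:(w + 1) * win]
        ts = list(range(win))
        # least-squares line fit within the window
        tm, sm = sum(ts) / win, sum(seg) / win
        b = (sum((t - tm) * (v - sm) for t, v in zip(ts, seg))
             / sum((t - tm) ** 2 for t in ts))
        a = sm - b * tm
        sq += sum((v - (a + b * t)) ** 2 for t, v in zip(ts, seg))
    return math.sqrt(sq / (nwin * win))

# scaling exponent alpha from F(n) ~ n^alpha, estimated from two window sizes
wins = [16, 128]
noise = [random.gauss(0, 1) for _ in range(4096)]    # white-noise test signal
F = [dfa_fluctuation(noise, n) for n in wins]
alpha = (math.log(F[1]) - math.log(F[0])) / (math.log(wins[1]) - math.log(wins[0]))
```

For uncorrelated noise alpha sits near 0.5; a drift of alpha away from its quiet-combustion baseline is the kind of precursor artifact the abstract describes.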

  15. Advanced Techniques of Stress Analysis

    Directory of Open Access Journals (Sweden)

    Simion TATARU

    2013-12-01

    This article aims to check the stress analysis technique based on 3D models, also making a comparison with the traditional technique which utilizes a model built directly into the stress analysis program. This comparison of the two methods will be made with reference to the rear fuselage of the IAR-99 aircraft, a structure with a high degree of complexity which allows a meaningful evaluation of both approaches. Three updated databases are envisaged: the database having the idealized model obtained using ANSYS and working directly on documentation, without automatic generation of nodes and elements (with few exceptions); the rear fuselage database (performed at this stage) obtained with Pro/ENGINEER; and the one obtained by using ANSYS with the second database. Then, each of the three databases will be used according to arising necessities. The main objective is to develop the parameterized model of the rear fuselage using the computer-aided design software Pro/ENGINEER. A review of research regarding the use of virtual reality with interactive analysis performed by the finite element method is made to show the state-of-the-art achieved in this field.

  16. Nonparametric factor analysis of time series

    OpenAIRE

    Rodríguez-Poo, Juan M.; Linton, Oliver Bruce

    1998-01-01

    We introduce a nonparametric smoothing procedure for nonparametric factor analysis of multivariate time series. The asymptotic properties of the proposed procedures are derived. We present an application based on the residuals from the Fair macromodel.

  17. Pipe-anchor discontinuity analysis utilizing power series solutions, Bessel functions, and Fourier series

    International Nuclear Information System (INIS)

    Williams, Dennis K.; Ranson, William F.

    2003-01-01

    One of the paradigmatic classes of problems that frequently arise in the piping stress analysis discipline is the effect of local stresses created by support and restraint attachments. Over the past 20 years, concerns have been identified by both regulatory agencies in the nuclear power industry and others in the process and chemicals industries concerning the effect of various stiff clamping arrangements on the expected life of the pipe and its various piping components. In many of the commonly utilized geometries and arrangements of pipe clamps, the elasticity problem becomes the axisymmetric stress and deformation determination in a hollow cylinder (pipe) subjected to the appropriate boundary conditions and respective loads per se. One of the geometries that serves as a pipe anchor is comprised of two pipe clamps that are bolted tightly to the pipe and affixed to a modified shoe-type arrangement. The shoe is employed for the purpose of providing an immovable base that can be easily attached, either by bolting or welding, to a structural steel pipe rack. Over the past 50 years, the computational tools available to the piping analyst have changed dramatically and thereby have caused the implementation of solutions to the basic problems of elasticity to change likewise. The need to obtain closed form elasticity solutions, however, has always been a driving force in engineering. The employment of symbolic calculus that is currently available through numerous software packages makes closed form solutions very economical. This paper briefly traces the solutions over the past 50 years to a variety of axisymmetric stress problems involving hollow circular cylinders employing a Fourier series representation. In the present example, a properly chosen Fourier series represents the mathematical simulation of the imposed axial displacements on the outside diametrical surface. A general solution technique is introduced for the axisymmetric discontinuity stresses resulting from an

  18. Techniques for Automated Performance Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Marcus, Ryan C. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2014-09-02

    The performance of a particular HPC code depends on a multitude of variables, including compiler selection, optimization flags, OpenMP pool size, file system load, memory usage, MPI configuration, etc. As a result of this complexity, current predictive models have limited applicability, especially at scale. We present a formulation of scientific codes, nodes, and clusters that reduces complex performance analysis to well-known mathematical techniques. Building accurate predictive models and enhancing our understanding of scientific codes at scale is an important step towards exascale computing.

  19. Contributions to fuzzy polynomial techniques for stability analysis and control

    OpenAIRE

    Pitarch Pérez, José Luis

    2014-01-01

    The present thesis employs fuzzy-polynomial control techniques in order to improve the stability analysis and control of nonlinear systems. Initially, it reviews the more widely used techniques in the field of Takagi-Sugeno fuzzy systems, as well as the more relevant results on polynomial and fuzzy polynomial systems. The basic framework uses fuzzy polynomial models obtained by Taylor series and sum-of-squares techniques (semidefinite programming) in order to obtain stability guarantees...

  20. DIY Solar Market Analysis Webinar Series: Solar Resource and Technical

    Science.gov (United States)

    DIY Solar Market Analysis Webinar Series: Solar Resource and Technical Potential. Wednesday, June 11, 2014. Part of the Do-It-Yourself Solar Market Analysis webinar series from NREL for state, local, and tribal governments.

  1. Time series analysis for psychological research: examining and forecasting change.

    Science.gov (United States)

    Jebb, Andrew T; Tay, Louis; Wang, Wei; Huang, Qiming

    2015-01-01

    Psychological research has increasingly recognized the importance of integrating temporal dynamics into its theories, and innovations in longitudinal designs and analyses have allowed such theories to be formalized and tested. However, psychological researchers may be relatively unequipped to analyze such data, given its many characteristics and the general complexities involved in longitudinal modeling. The current paper introduces time series analysis to psychological research, an analytic domain that has been essential for understanding and predicting the behavior of variables across many diverse fields. First, the characteristics of time series data are discussed. Second, different time series modeling techniques are surveyed that can address various topics of interest to psychological researchers, including describing the pattern of change in a variable, modeling seasonal effects, assessing the immediate and long-term impact of a salient event, and forecasting future values. To illustrate these methods, an illustrative example based on online job search behavior is used throughout the paper, and a software tutorial in R for these analyses is provided in the Supplementary Materials.

  2. Time series analysis for psychological research: examining and forecasting change

    Science.gov (United States)

    Jebb, Andrew T.; Tay, Louis; Wang, Wei; Huang, Qiming

    2015-01-01

    Psychological research has increasingly recognized the importance of integrating temporal dynamics into its theories, and innovations in longitudinal designs and analyses have allowed such theories to be formalized and tested. However, psychological researchers may be relatively unequipped to analyze such data, given its many characteristics and the general complexities involved in longitudinal modeling. The current paper introduces time series analysis to psychological research, an analytic domain that has been essential for understanding and predicting the behavior of variables across many diverse fields. First, the characteristics of time series data are discussed. Second, different time series modeling techniques are surveyed that can address various topics of interest to psychological researchers, including describing the pattern of change in a variable, modeling seasonal effects, assessing the immediate and long-term impact of a salient event, and forecasting future values. To illustrate these methods, an illustrative example based on online job search behavior is used throughout the paper, and a software tutorial in R for these analyses is provided in the Supplementary Materials. PMID:26106341

  3. Time series analysis in the social sciences the fundamentals

    CERN Document Server

    Shin, Youseop

    2017-01-01

    Time Series Analysis in the Social Sciences is a practical and highly readable introduction written exclusively for students and researchers whose mathematical background is limited to basic algebra. The book focuses on fundamental elements of time series analysis that social scientists need to understand so they can employ time series analysis for their research and practice. Through step-by-step explanations and using monthly violent crime rates as case studies, this book explains univariate time series from the preliminary visual analysis through the modeling of seasonality, trends, and re

  4. A Novel Generation Method for the PV Power Time Series Combining the Decomposition Technique and Markov Chain Theory

    DEFF Research Database (Denmark)

    Xu, Shenzhi; Ai, Xiaomeng; Fang, Jiakun

    2017-01-01

    Photovoltaic (PV) power generation has developed considerably in recent years, but the intermittency and volatility of its output have seriously affected the secure operation of the power system. In order to better understand PV generation and provide sufficient data support...... for analyzing these impacts, a novel generation method for PV power time series combining the decomposition technique and Markov chain theory is presented in this paper. It extracts important factors from the historical data of existing PV plants and then reproduces new data with similar patterns. In detail, the proposed...... method first decomposes the PV power time series into three parts: an ideal output curve, an amplitude parameter series and a random fluctuating component. The daily ideal output curve is then generated by extracting typical daily data, and the amplitude parameter series is generated by Markov chain Monte Carlo (MCMC...
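
    The Markov-chain step of the method above can be illustrated with a minimal sketch: quantize a historical amplitude-parameter series into discrete states, estimate a first-order transition matrix, and walk the chain to generate new, statistically similar sequences. The helper names and the state quantization are illustrative assumptions, not the authors' implementation.

```python
import random

def fit_transition_matrix(states, n_states):
    """Estimate a first-order Markov transition matrix from a state sequence."""
    counts = [[0] * n_states for _ in range(n_states)]
    for a, b in zip(states, states[1:]):
        counts[a][b] += 1
    matrix = []
    for row in counts:
        total = sum(row)
        # Fall back to a uniform row for states that were never left.
        matrix.append([c / total if total else 1.0 / n_states for c in row])
    return matrix

def sample_chain(matrix, start, length, rng):
    """Generate a new state sequence by walking the fitted Markov chain."""
    seq = [start]
    for _ in range(length - 1):
        u, acc = rng.random(), 0.0
        for j, p in enumerate(matrix[seq[-1]]):
            acc += p
            if u <= acc:
                seq.append(j)
                break
        else:  # guard against floating-point shortfall in the row sum
            seq.append(len(matrix) - 1)
    return seq
```

    In the paper's scheme the states would be quantized amplitude parameters; combining a sampled amplitude sequence with the ideal output curve and a fluctuation term then yields a synthetic PV power series.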

  5. TIME SERIES ANALYSIS USING A UNIQUE MODEL OF TRANSFORMATION

    Directory of Open Access Journals (Sweden)

    Goran Klepac

    2007-12-01

    Full Text Available The REFII model is an original mathematical model for time series data mining. Its main purpose is to automate time series analysis through a unique transformation model for time series. An advantage of this approach is that it links different methods of time series analysis, connects traditional data mining tools with time series, and supports the construction of new algorithms for analyzing time series. It is worth mentioning that the REFII model is not a closed system with a finite set of methods. It is, first of all, a model for transforming the values of a time series, which prepares data to be used by different sets of methods based on the same transformation model in a given problem domain. The REFII model offers a new approach to time series analysis based on a unique transformation model, which serves as a basis for all kinds of time series analysis. The advantage of the REFII model is its possible application in many different areas such as finance, medicine, voice recognition, face recognition and text mining.

  6. Mathematical foundations of time series analysis a concise introduction

    CERN Document Server

    Beran, Jan

    2017-01-01

    This book provides a concise introduction to the mathematical foundations of time series analysis, with an emphasis on mathematical clarity. The text is reduced to the essential logical core, mostly using the symbolic language of mathematics, thus enabling readers to very quickly grasp the essential reasoning behind time series analysis. It appeals to anybody wanting to understand time series in a precise, mathematical manner. It is suitable for graduate courses in time series analysis but is equally useful as a reference work for students and researchers alike.

  7. Multiscale multifractal multiproperty analysis of financial time series based on Rényi entropy

    Science.gov (United States)

    Yujun, Yang; Jianping, Li; Yimei, Yang

    This paper introduces a multiscale multifractal multiproperty analysis based on Rényi entropy (3MPAR) method to analyze short-range and long-range characteristics of financial time series, and then applies this method to the time series of five properties in four stock indices. Combining the two analysis techniques of Rényi entropy and multifractal detrended fluctuation analysis (MFDFA), the 3MPAR method focuses on the curves of Rényi entropy and the generalized Hurst exponent of five properties of four stock time series, which allows us to study more universal and subtle fluctuation characteristics of financial time series. By analyzing the curves of the Rényi entropy and the profiles of the logarithm distribution of MFDFA of five properties of four stock indices, the 3MPAR method reveals some fluctuation characteristics of the financial time series and the stock markets. It also reveals richer information about the financial time series by comparing the profiles of the five properties of the four stock indices. In this paper, we focus not only on the multifractality of time series but also on the fluctuation characteristics of the financial time series and subtle differences between the time series of different properties. We find that financial time series are far more complex than reported in research works that use only one property of a time series.
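
    The Rényi entropy at the core of the 3MPAR method has a simple closed form, H_q = log(sum_i p_i^q)/(1 - q), interpolating toward Shannon entropy as q -> 1. A minimal sketch with a hypothetical histogram-based estimate of the return distribution (the paper's multiscale machinery is not reproduced here):

```python
import math

def renyi_entropy(probabilities, q):
    """Rényi entropy of order q; reduces to Shannon entropy as q -> 1."""
    probs = [p for p in probabilities if p > 0]
    if q == 1:
        return -sum(p * math.log(p) for p in probs)
    return math.log(sum(p ** q for p in probs)) / (1 - q)

def return_distribution(prices, n_bins=10):
    """Histogram the log returns of a price series into a probability vector."""
    returns = [math.log(b / a) for a, b in zip(prices, prices[1:])]
    lo, hi = min(returns), max(returns)
    width = (hi - lo) / n_bins or 1.0  # degenerate case: all returns equal
    counts = [0] * n_bins
    for r in returns:
        counts[min(int((r - lo) / width), n_bins - 1)] += 1
    return [c / len(returns) for c in counts]
```

    Sweeping q over a range and plotting H_q against q traces the kind of Rényi-entropy curve that the 3MPAR method compares across properties and indices.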

  8. Water temperature forecasting and estimation using fourier series and communication theory techniques

    International Nuclear Information System (INIS)

    Long, L.L.

    1976-01-01

    Fourier series and statistical communication theory techniques are utilized in the estimation of river water temperature increases caused by external thermal inputs. An example estimate assuming a constant thermal input is demonstrated. A regression fit of the Fourier series approximation of temperature is then used to forecast daily average water temperatures. Also, a 60-day prediction of daily average water temperature is made with the aid of the Fourier regression fit by using significant Fourier components
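
    The regression fit of a Fourier series approximation described above can be sketched as an ordinary least-squares fit of the first annual harmonic; the 365-day period and the single harmonic are simplifying assumptions, not the paper's full model:

```python
import math

def solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def fit_fourier(days, temps, period=365.0):
    """Least-squares fit of T(t) = a0 + a1 cos(2 pi t / P) + b1 sin(2 pi t / P)."""
    X = [[1.0, math.cos(2 * math.pi * t / period), math.sin(2 * math.pi * t / period)]
         for t in days]
    # Normal equations X^T X beta = X^T y.
    XtX = [[sum(r[i] * r[j] for r in X) for j in range(3)] for i in range(3)]
    Xty = [sum(r[i] * y for r, y in zip(X, temps)) for i in range(3)]
    return solve(XtX, Xty)

def predict(beta, t, period=365.0):
    a0, a1, b1 = beta
    return a0 + a1 * math.cos(2 * math.pi * t / period) + b1 * math.sin(2 * math.pi * t / period)
```

    A 60-day forecast of daily average temperature, as in the abstract, is then just an evaluation of predict at future values of t.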

  9. Time Series Analysis Using Geometric Template Matching.

    Science.gov (United States)

    Frank, Jordan; Mannor, Shie; Pineau, Joelle; Precup, Doina

    2013-03-01

    We present a novel framework for analyzing univariate time series data. At the heart of the approach is a versatile algorithm for measuring the similarity of two segments of time series called geometric template matching (GeTeM). First, we use GeTeM to compute a similarity measure for clustering and nearest-neighbor classification. Next, we present a semi-supervised learning algorithm that uses the similarity measure with hierarchical clustering in order to improve classification performance when unlabeled training data are available. Finally, we present a boosting framework called TDEBOOST, which uses an ensemble of GeTeM classifiers. TDEBOOST augments the traditional boosting approach with an additional step in which the features used as inputs to the classifier are adapted at each step to improve the training error. We empirically evaluate the proposed approaches on several datasets, such as accelerometer data collected from wearable sensors and ECG data.

  10. Growth And Export Expansion In Mauritius - A Time Series Analysis ...

    African Journals Online (AJOL)

    Growth And Export Expansion In Mauritius - A Time Series Analysis. ... RV Sannassee, R Pearce ... Using Granger Causality tests, the short-run analysis results revealed that there is significant reciprocal causality between real export earnings ...

  11. Volatility Analysis of Bitcoin Price Time Series

    Directory of Open Access Journals (Sweden)

    Lukáš Pichl

    2017-12-01

    Full Text Available Bitcoin has the largest share in the total capitalization of cryptocurrency markets, currently reaching above 70 billion USD. In this work we focus on the price of Bitcoin in terms of standard currencies and their volatility over the last five years. The average day-to-day return throughout this period is 0.328%, amounting to exponential growth from 6 USD to over 4,000 USD per 1 BTC at present. Multi-scale analysis is performed from the level of the tick data, through the 5 min, 1 hour and 1 day scales. The distribution of trading volumes (1 sec, 1 min, 1 hour and 1 day) aggregated from the Kraken BTCEUR tick data is provided, showing the artifacts of algorithmic trading (selling transactions with volume peaks distributed at integer multiples of one BTC unit). Arbitrage opportunities are studied using the EUR, USD and CNY currencies. Whereas the arbitrage spread for the EUR-USD currency pair is narrow, on the order of a percent, at the 1 hour sampling period the arbitrage spread for USD-CNY (and similarly EUR-CNY) is more substantial, reaching above 5 percent on rare occasions. The volatility of BTC exchange rates is modeled using the day-to-day distribution of logarithmic returns and the Realized Volatility, the sum of squared logarithmic returns on a 5-minute basis. In this work we demonstrate that the Heterogeneous Autoregressive model for Realized Volatility of Andersen et al. (2007) applies reasonably well to the BTCUSD dataset. Finally, a feed-forward neural network with 2 hidden layers, using a 10-day moving window of daily return predictors, is applied to estimate the next-day logarithmic return. The results show that such an artificial neural network prediction is capable of approximately capturing the actual log return distribution; more sophisticated methods, such as recurrent neural networks and LSTM (Long Short-Term Memory) techniques from deep learning, may be necessary for higher prediction accuracy.
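
    The Heterogeneous Autoregressive (HAR) model mentioned above regresses next-day realized volatility on daily, weekly (5-day average) and monthly (22-day average) RV terms. A minimal pure-Python sketch (the helper names and the synthetic construction are assumptions, not the paper's code):

```python
import math

def solve(A, b):  # Gaussian elimination with partial pivoting
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def ols(rows, targets):
    """Ordinary least squares via the normal equations."""
    k = len(rows[0])
    XtX = [[sum(r[i] * r[j] for r in rows) for j in range(k)] for i in range(k)]
    Xty = [sum(r[i] * y for r, y in zip(rows, targets)) for i in range(k)]
    return solve(XtX, Xty)

def har_rows(rv):
    """[1, daily, weekly, monthly] HAR predictors for each day t >= 21."""
    return [[1.0, rv[t], sum(rv[t - 4:t + 1]) / 5, sum(rv[t - 21:t + 1]) / 22]
            for t in range(21, len(rv))]

def fit_har(rv):
    """Regress next-day realized volatility on the HAR predictors."""
    return ols(har_rows(rv[:-1]), rv[22:])
```

    The fitted coefficients quantify how much of tomorrow's volatility is explained by short-, medium- and long-horizon volatility components.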

  12. Stochastic time series analysis of hydrology data for water resources

    Science.gov (United States)

    Sathish, S.; Khadar Babu, S. K.

    2017-11-01

    This paper applies stochastic time series analysis to hydrology data with seasonal stages, using the Thomas-Fiering model. Different statistical tests are considered for predicting hydrologic time series with this model. Hydrologic time series of flood flows have received a great deal of attention worldwide, and interest in stochastic time series methods is expanding with growing concerns about seasonal periods and global warming. A recent trend among researchers is to test for seasonal periods in hydrologic flow series using stochastic processes based on the Thomas-Fiering model. The present article proposes to predict the seasonal periods in hydrology using the Thomas-Fiering model.
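
    The Thomas-Fiering model generates the month j+1 flow from the month j flow using monthly means, standard deviations and lag-1 correlations: q_{j+1} = mean_{j+1} + b_j (q_j - mean_j) + eps * s_{j+1} * sqrt(1 - r_j^2), with b_j = r_j s_{j+1} / s_j. A sketch under the simplifying assumption of a same-year December-to-January correlation:

```python
import math
import random
import statistics

def thomas_fiering(monthly_flows, n_years, rng):
    """Generate synthetic monthly flows with the Thomas-Fiering model.

    monthly_flows: historical record as a list of 12-month lists (one per year).
    """
    months = list(zip(*monthly_flows))            # 12 tuples of across-year values
    mean = [statistics.mean(m) for m in months]
    sd = [statistics.stdev(m) for m in months]
    r = []                                        # lag-1 correlation, month j -> j+1
    for j in range(12):
        k = (j + 1) % 12                          # simplification: December wraps to January
        mx, my = mean[j], mean[k]
        cov = sum((x - mx) * (y - my) for x, y in zip(months[j], months[k])) / (len(months[j]) - 1)
        r.append(cov / (sd[j] * sd[k]) if sd[j] and sd[k] else 0.0)
    flows = [mean[0]]
    for i in range(n_years * 12 - 1):
        j, k = i % 12, (i + 1) % 12
        b = r[j] * sd[k] / sd[j] if sd[j] else 0.0
        shock = rng.gauss(0.0, 1.0) * sd[k] * math.sqrt(max(0.0, 1.0 - r[j] ** 2))
        flows.append(max(0.0, mean[k] + b * (flows[-1] - mean[j]) + shock))  # flows >= 0
    return flows
```

    Long synthetic traces generated this way preserve the monthly means, variances and month-to-month persistence of the historical record, which is what makes the model useful for reservoir design studies.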

  13. On-line analysis of reactor noise using time-series analysis

    International Nuclear Information System (INIS)

    McGevna, V.G.

    1981-10-01

    A method to allow use of time series analysis for on-line noise analysis has been developed. On-line analysis of noise in nuclear power reactors has been limited primarily to spectral analysis and related frequency domain techniques. Time series analysis has many distinct advantages over spectral analysis in the automated processing of reactor noise. However, fitting an autoregressive-moving average (ARMA) model to time series data involves non-linear least squares estimation. Unless a high speed, general purpose computer is available, the calculations become too time consuming for on-line applications. To eliminate this problem, a special purpose algorithm was developed for fitting ARMA models. While it is based on a combination of steepest descent and Taylor series linearization, properties of the ARMA model are used so that the auto- and cross-correlation functions can be used to eliminate the need for estimating derivatives. The number of calculations per iteration varies linearly, making the method suitable for on-line applications.
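
    The record does not give the full derivative-free fitting scheme, but the underlying idea, obtaining model coefficients directly from autocorrelations rather than by numerical differentiation, is easy to illustrate for the pure AR case via the Yule-Walker equations (the AR(2) restriction here is an assumption made for brevity):

```python
def autocorr(x, max_lag):
    """Sample autocorrelation function up to max_lag."""
    n = len(x)
    mean = sum(x) / n
    var = sum((v - mean) ** 2 for v in x)
    return [sum((x[t] - mean) * (x[t + lag] - mean) for t in range(n - lag)) / var
            for lag in range(max_lag + 1)]

def yule_walker_ar2(rho1, rho2):
    """Solve the 2x2 Yule-Walker system
       [1    rho1] [phi1]   [rho1]
       [rho1 1   ] [phi2] = [rho2]
    for the AR(2) coefficients."""
    det = 1.0 - rho1 ** 2
    return (rho1 * (1.0 - rho2) / det, (rho2 - rho1 ** 2) / det)
```

    Because only autocorrelations enter the system, no derivative estimates are needed, which is the spirit of the special-purpose algorithm described above.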

  14. Economic Analysis in Series-Distillation Desalination

    Directory of Open Access Journals (Sweden)

    Mirna Rahmah Lubis

    2010-06-01

    Full Text Available The ability to produce potable water economically is the primary purpose of seawater desalination research. Reverse osmosis (RO and multi-stage flash (MSF cost more than potable water produced from fresh water resources. Therefore, this research investigates a high-efficiency mechanical vapor-compression distillation system that employs an improved water flow arrangement. The incoming salt concentration was 0.15% salt for brackish water and 3.5% salt for seawater, whereas the outgoing salt concentration was 1.5% and 7%, respectively. Distillation was performed at 439 K and 722 kPa for both brackish water feed and seawater feed. Water costs of the various conditions were calculated for brackish water and seawater feeds using optimum conditions considered as 25 and 20 stages, respectively. For brackish water at a temperature difference of 0.96 K, the energy requirement is 2.0 kWh/m3. At this condition, the estimated water cost is $0.39/m3 achieved with 10,000,000 gal/day distillate, 30-year bond, 5% interest rate, and $0.05/kWh electricity. For seawater at a temperature difference of 0.44 K, the energy requirement is 3.97 kWh/m3 and the estimated water cost is $0.61/m3. Greater efficiency of the vapor compression system is achieved by connecting multiple evaporators in series, rather than the traditional parallel arrangement. The efficiency results from the gradual increase of salinity in each stage of the series arrangement in comparison to parallel. Calculations using various temperature differences between boiling brine and condensing steam show the series arrangement has the greatest improvement at lower temperature differences. Keywords: desalination, dropwise condensation, mechanical-vapor compression

  15. Analysis of JET ELMy time series

    International Nuclear Information System (INIS)

    Zvejnieks, G.; Kuzovkov, V.N.

    2005-01-01

    Full text: Achievement of the planned operational regime in the next generation tokamaks (such as ITER) still faces principal problems. One of the main challenges is obtaining control of edge localized modes (ELMs), which should lead to both long plasma pulse times and a reasonable divertor lifetime. In order to control ELMs, the hypothesis was proposed by Degeling [1] that ELMs exhibit features of chaotic dynamics and thus standard chaos control methods might be applicable. However, our findings, which are based on the nonlinear autoregressive (NAR) model, contradict this hypothesis for JET ELMy time series. In turn, this means that ELM behavior is of a relaxation or random type. These conclusions coincide with our previous results obtained for ASDEX Upgrade time series [2]. [1] A.W. Degeling, Y.R. Martin, P.E. Bak, J.B. Lister, and X. Llobet, Plasma Phys. Control. Fusion 43, 1671 (2001). [2] G. Zvejnieks, V.N. Kuzovkov, O. Dumbrajs, A.W. Degeling, W. Suttrop, H. Urano, and H. Zohm, Physics of Plasmas 11, 5658 (2004)

  16. A technique for filling gaps in time series with complicated power spectra

    International Nuclear Information System (INIS)

    Brown, T.M.

    1984-01-01

    Fahlman and Ulrych (1982) describe a method for estimating the power and phase spectra of gapped time series, using a maximum-entropy reconstruction of the data in the gaps. It has proved difficult to apply this technique to solar oscillations data, because of the great complexity of the solar oscillations spectrum. We describe a means for avoiding this difficulty, and report the results of a series of blind tests of the modified technique. The main results of these tests are: 1. Gap-filling gives good results, provided that the signal-to-noise ratio in the original data is large enough, and provided the gaps are short enough. For low-noise data, the duty cycle of the observations should not be less than about 50%. 2. The frequencies and widths of narrow spectrum features are well reproduced by the technique. 3. The technique systematically reduces the apparent amplitudes of small features in the spectrum relative to large ones. (orig.)

  17. The Recoverability of P-Technique Factor Analysis

    Science.gov (United States)

    Molenaar, Peter C. M.; Nesselroade, John R.

    2009-01-01

    It seems that just when we are about to lay P-technique factor analysis finally to rest as obsolete because of newer, more sophisticated multivariate time-series models using latent variables--dynamic factor models--it rears its head to inform us that an obituary may be premature. We present the results of some simulations demonstrating that even…

  18. STUDIES IN ASTRONOMICAL TIME SERIES ANALYSIS. VI. BAYESIAN BLOCK REPRESENTATIONS

    Energy Technology Data Exchange (ETDEWEB)

    Scargle, Jeffrey D. [Space Science and Astrobiology Division, MS 245-3, NASA Ames Research Center, Moffett Field, CA 94035-1000 (United States); Norris, Jay P. [Physics Department, Boise State University, 2110 University Drive, Boise, ID 83725-1570 (United States); Jackson, Brad [The Center for Applied Mathematics and Computer Science, Department of Mathematics, San Jose State University, One Washington Square, MH 308, San Jose, CA 95192-0103 (United States); Chiang, James, E-mail: jeffrey.d.scargle@nasa.gov [W. W. Hansen Experimental Physics Laboratory, Kavli Institute for Particle Astrophysics and Cosmology, Department of Physics and SLAC National Accelerator Laboratory, Stanford University, Stanford, CA 94305 (United States)

    2013-02-20

    This paper addresses the problem of detecting and characterizing local variability in time series and other forms of sequential data. The goal is to identify and characterize statistically significant variations, at the same time suppressing the inevitable corrupting observational errors. We present a simple nonparametric modeling technique and an algorithm implementing it, an improved and generalized version of Bayesian Blocks, that finds the optimal segmentation of the data in the observation interval. The structure of the algorithm allows it to be used in either a real-time trigger mode, or a retrospective mode. Maximum likelihood or marginal posterior functions to measure model fitness are presented for events, binned counts, and measurements at arbitrary times with known error distributions. Problems addressed include those connected with data gaps, variable exposure, extension to piecewise linear and piecewise exponential representations, multivariate time series data, analysis of variance, data on the circle, other data modes, and dispersed data. Simulations provide evidence that the detection efficiency for weak signals is close to a theoretical asymptotic limit derived by Arias-Castro et al. In the spirit of Reproducible Research all of the code and data necessary to reproduce all of the figures in this paper are included as supplementary material.
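
    For the binned-counts case, the optimal segmentation can be found with an O(n^2) dynamic program over possible change points. The sketch below uses the Poisson block log-likelihood N log(N/T) and a fixed per-block prior penalty; the penalty value and equal-width bins are simplifying assumptions (the paper calibrates ncp_prior against a false-positive rate):

```python
import math

def bayesian_blocks_binned(counts, ncp_prior=4.0):
    """Optimal piecewise-constant segmentation of equal-width binned counts.

    Scargle-style dynamic programming over change points: each block's fitness
    is the Poisson maximum log-likelihood N log(N/T) minus a per-block prior
    penalty. Returns the list of block start indices.
    """
    n = len(counts)
    cum = [0]
    for c in counts:
        cum.append(cum[-1] + c)
    best = [0.0] * (n + 1)   # best[k]: optimal total fitness of the first k bins
    last = [0] * (n + 1)     # last[k]: start index of the final block in that optimum
    for k in range(1, n + 1):
        best_val, best_r = None, 0
        for r in range(k):
            N = cum[k] - cum[r]                 # counts in the candidate block [r, k)
            T = float(k - r)                    # its width, assuming unit-width bins
            fit = N * math.log(N / T) if N > 0 else 0.0
            val = best[r] + fit - ncp_prior
            if best_val is None or val > best_val:
                best_val, best_r = val, r
        best[k], last[k] = best_val, best_r
    starts, k = [], n                           # backtrack to recover change points
    while k > 0:
        starts.append(last[k])
        k = last[k]
    return starts[::-1]
```

    On data with a genuine rate change the optimum places a block edge at the change point, while homogeneous data stays in a single block because any extra edge costs the prior penalty without improving the likelihood.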

  19. Studies in Astronomical Time Series Analysis. VI. Bayesian Block Representations

    Science.gov (United States)

    Scargle, Jeffrey D.; Norris, Jay P.; Jackson, Brad; Chiang, James

    2013-01-01

    This paper addresses the problem of detecting and characterizing local variability in time series and other forms of sequential data. The goal is to identify and characterize statistically significant variations, at the same time suppressing the inevitable corrupting observational errors. We present a simple nonparametric modeling technique and an algorithm implementing it, an improved and generalized version of Bayesian Blocks [Scargle 1998], that finds the optimal segmentation of the data in the observation interval. The structure of the algorithm allows it to be used in either a real-time trigger mode, or a retrospective mode. Maximum likelihood or marginal posterior functions to measure model fitness are presented for events, binned counts, and measurements at arbitrary times with known error distributions. Problems addressed include those connected with data gaps, variable exposure, extension to piecewise linear and piecewise exponential representations, multivariate time series data, analysis of variance, data on the circle, other data modes, and dispersed data. Simulations provide evidence that the detection efficiency for weak signals is close to a theoretical asymptotic limit derived by [Arias-Castro, Donoho and Huo 2003]. In the spirit of Reproducible Research [Donoho et al. (2008)] all of the code and data necessary to reproduce all of the figures in this paper are included as auxiliary material.

  20. STUDIES IN ASTRONOMICAL TIME SERIES ANALYSIS. VI. BAYESIAN BLOCK REPRESENTATIONS

    International Nuclear Information System (INIS)

    Scargle, Jeffrey D.; Norris, Jay P.; Jackson, Brad; Chiang, James

    2013-01-01

    This paper addresses the problem of detecting and characterizing local variability in time series and other forms of sequential data. The goal is to identify and characterize statistically significant variations, at the same time suppressing the inevitable corrupting observational errors. We present a simple nonparametric modeling technique and an algorithm implementing it—an improved and generalized version of Bayesian Blocks—that finds the optimal segmentation of the data in the observation interval. The structure of the algorithm allows it to be used in either a real-time trigger mode, or a retrospective mode. Maximum likelihood or marginal posterior functions to measure model fitness are presented for events, binned counts, and measurements at arbitrary times with known error distributions. Problems addressed include those connected with data gaps, variable exposure, extension to piecewise linear and piecewise exponential representations, multivariate time series data, analysis of variance, data on the circle, other data modes, and dispersed data. Simulations provide evidence that the detection efficiency for weak signals is close to a theoretical asymptotic limit derived by Arias-Castro et al. In the spirit of Reproducible Research all of the code and data necessary to reproduce all of the figures in this paper are included as supplementary material.

  1. Management of odontogenic cysts by endonasal endoscopic techniques: A systematic review and case series.

    Science.gov (United States)

    Marino, Michael J; Luong, Amber; Yao, William C; Citardi, Martin J

    2018-01-01

    Odontogenic cysts and tumors of the maxilla may be amenable to management by endonasal endoscopic techniques, which may reduce the morbidity associated with open procedures and avoid difficult reconstruction. To perform a systematic review that evaluates the feasibility and outcomes of endoscopic techniques in the management of different odontogenic cysts. A case series of our experience with these minimally invasive techniques was assembled for insight into the technical aspects of these procedures. The Preferred Reporting Items for Systematic Reviews and Meta-Analyses was used to identify English-language studies that reported the use of endoscopic techniques in the management of odontogenic cysts. Several medical literature databases were searched for all occurrences in the title or abstract of the terms "odontogenic" and "endoscopic" between January 1, 1950, and October 1, 2016. Publications were evaluated for the technique used, histopathology, complications, recurrences, and the follow-up period. A case series of patients who presented to a tertiary rhinology clinic and who underwent treatment of odontogenic cysts by an endoscopic technique was included. A systematic review identified 16 case reports or series that described the use of endoscopic techniques for the treatment of odontogenic cysts, including 45 total patients. Histopathologies encountered were radicular (n = 16) and dentigerous cysts (n = 10), and keratocystic odontogenic tumor (n = 12). There were no reported recurrences or major complications for a mean follow-up of 29 months. A case series of patients in our institution identified seven patients without recurrence for a mean follow-up of 10 months. Endonasal endoscopic treatment of various odontogenic cysts is described in the literature and is associated with effective treatment of these lesions for an average follow-up period of >2 years. These techniques have the potential to reduce morbidity associated with the resection of these

  2. Tool Wear Monitoring Using Time Series Analysis

    Science.gov (United States)

    Song, Dong Yeul; Ohara, Yasuhiro; Tamaki, Haruo; Suga, Masanobu

    A tool wear monitoring approach considering the nonlinear behavior of cutting mechanism caused by tool wear and/or localized chipping is proposed, and its effectiveness is verified through the cutting experiment and actual turning machining. Moreover, the variation in the surface roughness of the machined workpiece is also discussed using this approach. In this approach, the residual error between the actually measured vibration signal and the estimated signal obtained from the time series model corresponding to dynamic model of cutting is introduced as the feature of diagnosis. Consequently, it is found that the early tool wear state (i.e. flank wear under 40µm) can be monitored, and also the optimal tool exchange time and the tool wear state for actual turning machining can be judged by this change in the residual error. Moreover, the variation of surface roughness Pz in the range of 3 to 8µm can be estimated by the monitoring of the residual error.
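
    The residual-error idea above can be sketched as follows: fit a time series model to vibration data from a healthy cutting state, then monitor the RMS of one-step prediction errors on new data; growth in this residual signals wear or chipping. The AR(2) model below is an illustrative assumption, not the authors' cutting-dynamics model.

```python
import math

def fit_ar2(x):
    """Least-squares AR(2) fit x[t] ~ a*x[t-1] + b*x[t-2] (no intercept)."""
    t_range = range(2, len(x))
    s11 = sum(x[t - 1] * x[t - 1] for t in t_range)
    s12 = sum(x[t - 1] * x[t - 2] for t in t_range)
    s22 = sum(x[t - 2] * x[t - 2] for t in t_range)
    y1 = sum(x[t] * x[t - 1] for t in t_range)
    y2 = sum(x[t] * x[t - 2] for t in t_range)
    det = s11 * s22 - s12 * s12
    return (y1 * s22 - y2 * s12) / det, (s11 * y2 - s12 * y1) / det

def residual_rms(x, a, b):
    """RMS of one-step prediction errors under the fitted model."""
    errs = [x[t] - a * x[t - 1] - b * x[t - 2] for t in range(2, len(x))]
    return math.sqrt(sum(e * e for e in errs) / len(errs))
```

    In practice the model would be fitted to vibration records from a known-healthy state, and a persistent rise in residual_rms on new records would motivate tool inspection or exchange.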

  3. Remote-Sensing Time Series Analysis, a Vegetation Monitoring Tool

    Science.gov (United States)

    McKellip, Rodney; Prados, Donald; Ryan, Robert; Ross, Kenton; Spruce, Joseph; Gasser, Gerald; Greer, Randall

    2008-01-01

    The Time Series Product Tool (TSPT) is software, developed in MATLAB, which creates and displays high signal-to-noise Vegetation Indices imagery and other higher-level products derived from remotely sensed data. This tool enables automated, rapid, large-scale regional surveillance of crops, forests, and other vegetation. TSPT temporally processes high-revisit-rate satellite imagery produced by the Moderate Resolution Imaging Spectroradiometer (MODIS) and by other remote-sensing systems. Although MODIS imagery is acquired daily, cloudiness and other sources of noise can greatly reduce the effective temporal resolution. To improve cloud statistics, the TSPT combines MODIS data from multiple satellites (Aqua and Terra). The TSPT produces MODIS products as single time-frame and multitemporal change images, as time-series plots at a selected location, or as temporally processed image videos. The TSPT program uses MODIS metadata to remove and/or correct bad and suspect data. Bad pixel removal, multiple satellite data fusion, and temporal processing techniques create high-quality plots and animated image video sequences that depict changes in vegetation greenness. This tool provides several temporal processing options not found in other comparable imaging software tools. Because the framework to generate and use other algorithms is established, small modifications to this tool will enable the use of a large range of remotely sensed data types. An effective remote-sensing crop monitoring system must be able to detect subtle changes in plant health in the earliest stages, before the effects of a disease outbreak or other adverse environmental conditions can become widespread and devastating.
The integration of the time series analysis tool with ground-based information, soil types, crop types, meteorological data, and crop growth models in a Geographic Information System, could provide the foundation for a large-area crop-surveillance system that could identify
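
    The multi-satellite fusion step can be illustrated with a maximum-value composite, a common heuristic for suppressing cloud-depressed NDVI values; the actual TSPT processing chain is more elaborate, so this is only a sketch:

```python
def composite_max(terra, aqua):
    """Per-pixel maximum-value composite of two NDVI series.

    None marks cloud-contaminated or missing observations; the composite keeps
    whichever satellite saw a valid (and greener) value at each position.
    """
    out = []
    for a, b in zip(terra, aqua):
        vals = [v for v in (a, b) if v is not None]
        out.append(max(vals) if vals else None)
    return out
```

    Because clouds depress NDVI, taking the larger of the two same-day observations tends to retain the cloud-free view, which is why maximum-value compositing improves the effective temporal resolution.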

  4. Affirmative Action. Module Number 16. Work Experience Program Modules. Coordination Techniques Series.

    Science.gov (United States)

    Shawhan, Carl; Morley, Ray

    This self-instructional module, the last in a series of 16 on techniques for coordinating work experience programs, deals with affirmative action. Addressed in the module are the following topics: the nature of affirmative action legislation and regulations, the role of the teacher-coordinator as a resource person for affirmative action…

  5. Inorganic chemical analysis of environmental materials—A lecture series

    Science.gov (United States)

    Crock, J.G.; Lamothe, P.J.

    2011-01-01

    At the request of the faculty of the Colorado School of Mines, Golden, Colorado, the authors prepared and presented a lecture series to the students of a graduate level advanced instrumental analysis class. The slides and text presented in this report are a compilation and condensation of this series of lectures. The purpose of this report is to present the slides and notes and to emphasize the thought processes that should be used by a scientist submitting samples for analyses in order to procure analytical data to answer a research question. First and foremost, the analytical data generated can be no better than the samples submitted. The questions to be answered must first be well defined and the appropriate samples collected from the population that will answer the question. The proper methods of analysis, including proper sample preparation and digestion techniques, must then be applied. Care must be taken to achieve the required limits of detection of the critical analytes to yield detectable analyte concentration (above "action" levels) for the majority of the study's samples and to address what portion of those analytes answers the research question: total or partial concentrations. To guarantee a robust analytical result that answers the research question(s), a well-defined quality assurance and quality control (QA/QC) plan must be employed. This QA/QC plan must include the collection and analysis of field and laboratory blanks, sample duplicates, and matrix-matched standard reference materials (SRMs). The proper SRMs may include in-house materials and/or a selection of widely available commercial materials. A discussion of the preparation and applicability of in-house reference materials is also presented. Only when all these analytical issues are sufficiently addressed can the research questions be answered with known certainty.

  6. Metagenomics meets time series analysis: unraveling microbial community dynamics

    NARCIS (Netherlands)

    Faust, K.; Lahti, L.M.; Gonze, D.; Vos, de W.M.; Raes, J.

    2015-01-01

    The recent increase in the number of microbial time series studies offers new insights into the stability and dynamics of microbial communities, from the world's oceans to human microbiota. Dedicated time series analysis tools allow taking full advantage of these data. Such tools can reveal periodic

  7. Time-series analysis for ambient concentrations

    Energy Technology Data Exchange (ETDEWEB)

    Gonzalez-Manteiga, W.; Prada-Sanchez, J.M.; Cao, R.; Garcia-Jurado, I.; Febrero-Bande, M.; Lucas-Dominguez, T. (Santiago de Compostela University, Santiago de Compostela (Spain). Dept. of Statistics and Operations Research)

    1993-02-01

    In this paper a dynamic system is presented which has been implemented to predict, every 5 min, the ambient concentrations of SO2 in the neighbourhood of a power station run by ENDESA, the National Electricity Company of Spain, in As Pontes. This prediction task is very important in order to prevent high ground-level concentrations of SO2. For forecasting, a mixed model is used which has a parametric component and a nonparametric one. Confidence intervals are also constructed for future observations using bootstrap and classical techniques. 4 refs., 5 figs., 3 tabs.
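
    The bootstrap construction of confidence intervals for future observations can be sketched with a simple residual bootstrap around a fitted predictor. The AR(1) stand-in below is an assumption made for brevity; the paper's model mixes parametric and nonparametric components.

```python
import random

def fit_ar1(x):
    """OLS fit of x[t] = c + phi * x[t-1]."""
    pairs = list(zip(x, x[1:]))
    n = len(pairs)
    mx = sum(p for p, _ in pairs) / n
    my = sum(q for _, q in pairs) / n
    phi = (sum((p - mx) * (q - my) for p, q in pairs)
           / sum((p - mx) ** 2 for p, _ in pairs))
    return my - phi * mx, phi

def bootstrap_interval(x, n_boot=2000, alpha=0.05, rng=None):
    """Residual-bootstrap prediction interval for the next observation."""
    rng = rng or random.Random(0)
    c, phi = fit_ar1(x)
    residuals = [q - (c + phi * p) for p, q in zip(x, x[1:])]
    point = c + phi * x[-1]
    # Resample residuals around the point forecast and take percentile bounds.
    draws = sorted(point + rng.choice(residuals) for _ in range(n_boot))
    return draws[int(alpha / 2 * n_boot)], point, draws[int((1 - alpha / 2) * n_boot) - 1]
```

    The percentile bounds reflect the empirical error distribution of the fitted model rather than a Gaussian assumption, which is the appeal of the bootstrap approach for pollution data.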

  8. Interrupted time-series analysis: studying trends in neurosurgery.

    Science.gov (United States)

    Wong, Ricky H; Smieliauskas, Fabrice; Pan, I-Wen; Lam, Sandi K

    2015-12-01

    OBJECT Neurosurgery studies traditionally have evaluated the effects of interventions on health care outcomes by studying overall changes in measured outcomes over time. Yet, this type of linear analysis is limited due to lack of consideration of the trend's effects both pre- and postintervention and the potential for confounding influences. The aim of this study was to illustrate interrupted time-series analysis (ITSA) as applied to an example in the neurosurgical literature and highlight ITSA's potential for future applications. METHODS The methods used in previous neurosurgical studies were analyzed and then compared with the methodology of ITSA. RESULTS The ITSA method was identified in the neurosurgical literature as an important technique for isolating the effect of an intervention (such as a policy change or a quality and safety initiative) on a health outcome independent of other factors driving trends in the outcome. The authors determined that ITSA allows for analysis of the intervention's immediate impact on outcome level and on subsequent trends and enables a more careful measure of the causal effects of interventions on health care outcomes. CONCLUSIONS ITSA represents a significant improvement over traditional observational study designs in quantifying the impact of an intervention. ITSA is a useful statistical procedure to understand, consider, and implement as the field of neurosurgery evolves in sophistication in big-data analytics, economics, and health services research.
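
    The standard ITSA specification is a segmented regression, y_t = b0 + b1*t + b2*D_t + b3*(t - t0)*D_t, where D_t indicates the post-intervention period; b2 captures the immediate level change and b3 the change in trend. A minimal sketch (single interruption, no autocorrelation correction):

```python
def solve(A, b):  # Gaussian elimination with partial pivoting
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def fit_itsa(y, t0):
    """OLS for y[t] = b0 + b1*t + b2*D[t] + b3*(t - t0)*D[t], D[t] = [t >= t0].

    b2 is the immediate level change at the intervention; b3 the slope change.
    """
    rows = [[1.0, float(t), 1.0 if t >= t0 else 0.0,
             float(t - t0) if t >= t0 else 0.0] for t in range(len(y))]
    XtX = [[sum(r[i] * r[j] for r in rows) for j in range(4)] for i in range(4)]
    Xty = [sum(r[i] * v for r, v in zip(rows, y)) for i in range(4)]
    return solve(XtX, Xty)
```

    Separating the level change (b2) from the trend change (b3) is exactly what lets ITSA isolate an intervention's effect from the pre-existing trend.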

  9. Independent component analysis: A new possibility for analysing series of electron energy loss spectra

    International Nuclear Information System (INIS)

    Bonnet, Noël; Nuzillard, Danielle

    2005-01-01

    A complementary approach is proposed for analysing series of electron energy-loss spectra that can be recorded with the spectrum-line technique, across an interface for instance. This approach, called blind source separation (BSS) or independent component analysis (ICA), complements two existing methods: the spatial difference approach and multivariate statistical analysis. The principle of the technique is presented and illustrations are given through one simulated example and one real example.

  10. Complexity testing techniques for time series data: A comprehensive literature review

    International Nuclear Information System (INIS)

    Tang, Ling; Lv, Huiling; Yang, Fengmei; Yu, Lean

    2015-01-01

    Highlights: • A literature review of complexity testing techniques for time series data is provided. • Complexity measurements can generally fall into fractality, methods derived from nonlinear dynamics and entropy. • Different types investigate time series data from different perspectives. • Measures, applications and future studies for each type are presented. - Abstract: Complexity may be one of the most important measurements for analysing time series data; it covers or is at least closely related to different data characteristics within nonlinear system theory. This paper provides a comprehensive literature review examining the complexity testing techniques for time series data. According to different features, the complexity measurements for time series data can be divided into three primary groups, i.e., fractality (mono- or multi-fractality) for self-similarity (or system memorability or long-term persistence), methods derived from nonlinear dynamics (via attractor invariants or diagram descriptions) for attractor properties in phase-space, and entropy (structural or dynamical entropy) for the disorder state of a nonlinear system. These estimations analyse time series dynamics from different perspectives but are closely related to or even dependent on each other at the same time. In particular, a weaker self-similarity, a more complex structure of attractor, and a higher-level disorder state of a system consistently indicate that the observed time series data are at a higher level of complexity. Accordingly, this paper presents a historical tour of the important measures and works for each group, as well as ground-breaking and recent applications and future research directions.
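As one concrete instance of the entropy group of complexity measures surveyed above, the sketch below computes normalized permutation entropy (the Bandt-Pompe measure): it counts ordinal patterns in short windows and measures their disorder. The signals and parameters are illustrative, not taken from the review.

```python
import random
from itertools import permutations
from math import log, sin

def permutation_entropy(x, order=3):
    """Normalized permutation entropy: 0 = fully ordered, 1 = maximal disorder."""
    counts = {p: 0 for p in permutations(range(order))}
    for i in range(len(x) - order + 1):
        window = x[i:i + order]
        # ordinal pattern: the permutation that sorts the window
        pattern = tuple(sorted(range(order), key=lambda k: window[k]))
        counts[pattern] += 1
    total = sum(counts.values())
    probs = [c / total for c in counts.values() if c > 0]
    return -sum(p * log(p) for p in probs) / log(len(counts))

random.seed(0)
regular = [sin(0.1 * i) for i in range(1000)]   # low-complexity periodic signal
noise = [random.random() for _ in range(1000)]  # high-complexity white noise
print(permutation_entropy(regular), permutation_entropy(noise))
```

Consistent with the review's point, the disordered signal scores near 1 while the self-similar periodic signal scores much lower.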

  11. Biological time series analysis using a context free language: applicability to pulsatile hormone data.

    Directory of Open Access Journals (Sweden)

    Dennis A Dean

    Full Text Available We present a novel approach for analyzing biological time-series data using a context-free language (CFL) representation that allows the extraction and quantification of important features from the time-series. This representation results in Hierarchically AdaPtive (HAP) analysis, a suite of complementary techniques that enables rapid analysis of data without requiring the user to set parameters. HAP analysis generates hierarchically organized parameter distributions that allow multi-scale components of the time-series to be quantified, and includes a data analysis pipeline that applies recursive analyses to produce hierarchically organized results extending traditional outcome measures such as pharmacokinetics and inter-pulse interval. Pulsicons, a novel text-based time-series representation also derived from the CFL approach, are introduced as an objective qualitative comparison nomenclature. We apply HAP to the analysis of 24 hours of frequently sampled pulsatile cortisol hormone data, which pose known analysis challenges, from 14 healthy women. HAP analysis generated results in seconds and produced dozens of figures for each participant. The results quantify the observed qualitative features of cortisol data as a series of pulse clusters, each consisting of one or more embedded pulses, and identify two ultradian phenotypes in this dataset. HAP analysis is designed to be robust to individual differences and to missing data and may be applied to other pulsatile hormones. Future work can extend HAP analysis to other time-series data types, including oscillatory and other periodic physiological signals.

  12. Topic Time Series Analysis of Microblogs

    Science.gov (United States)

    2014-10-01

    may be distributed more globally. Tweets on a specific topic that cluster spatially, temporally or both might be of interest to analysts, marketers ...of $ and @, with the latter only in the case that it is the only character in the token (the @ symbol is significant in its usage by Instagram in...is generated by Instagram . Topic 80, Distance: 143.2101 Top words: 1. rawr 2. ˆ0ˆ 3. kill 4. jurassic 5. dinosaur Analysis: This topic is quite

  13. Time Series Factor Analysis with an Application to Measuring Money

    NARCIS (Netherlands)

    Gilbert, Paul D.; Meijer, Erik

    2005-01-01

    Time series factor analysis (TSFA) and its associated statistical theory is developed. Unlike dynamic factor analysis (DFA), TSFA obviates the need for explicitly modeling the process dynamics of the underlying phenomena. It also differs from standard factor analysis (FA) in important respects: the

  14. Reliability analysis techniques in power plant design

    International Nuclear Information System (INIS)

    Chang, N.E.

    1981-01-01

    An overview of reliability analysis techniques is presented as applied to power plant design. The key terms, power plant performance, reliability, availability and maintainability are defined. Reliability modeling, methods of analysis and component reliability data are briefly reviewed. Application of reliability analysis techniques from a design engineering approach to improving power plant productivity is discussed. (author)

  15. Time averaging, ageing and delay analysis of financial time series

    Science.gov (United States)

    Cherstvy, Andrey G.; Vinod, Deepak; Aghion, Erez; Chechkin, Aleksei V.; Metzler, Ralf

    2017-06-01

    We introduce three strategies for the analysis of financial time series based on time averaged observables. These comprise the time averaged mean squared displacement (MSD) as well as the ageing and delay time methods for varying fractions of the financial time series. We explore these concepts via statistical analysis of historic time series for several Dow Jones Industrial indices for the period from the 1960s to 2015. Remarkably, we discover a simple universal law for the delay time averaged MSD. The observed features of the financial time series dynamics agree well with our analytical results for the time averaged measurables for geometric Brownian motion, underlying the famed Black-Scholes-Merton model. The concepts we promote here are shown to be useful for financial data analysis and enable one to unveil new universal features of stock market dynamics.
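The time averaged MSD at the core of these strategies is straightforward to compute: for a lag Δ it averages (x(t+Δ) − x(t))² along a single trajectory. A minimal sketch on a synthetic random walk standing in for a log-price series (the paper's actual Dow Jones data are not reproduced here):

```python
import random

def ta_msd(x, lag):
    """Time averaged mean squared displacement at a given lag (the 'delta')."""
    n = len(x)
    return sum((x[t + lag] - x[t]) ** 2 for t in range(n - lag)) / (n - lag)

random.seed(42)
# Hypothetical log-price trajectory: a random walk with unit-variance increments
x = [0.0]
for _ in range(20000):
    x.append(x[-1] + random.gauss(0, 1))

for lag in (1, 10, 100):
    print(lag, ta_msd(x, lag))  # grows roughly linearly in lag for Brownian-type motion
```

For geometric Brownian motion, as studied in the paper, the same time average is applied to the log of the price; the linear growth in lag is the baseline against which ageing and delay effects are measured.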

  16. Time Series Analysis of the Quasar PKS 1749+096

    Science.gov (United States)

    Lam, Michael T.; Balonek, T. J.

    2011-01-01

    Multiple timescales of variability are observed in quasars at a variety of wavelengths, the nature of which is not fully understood. In 2007 and 2008, the quasar 1749+096 underwent two unprecedented optical outbursts, reaching a brightness never before seen in our twenty years of monitoring. Much lower level activity had been seen prior to these two outbursts. We present an analysis of the timescales of variability over the two regimes using a variety of statistical techniques. An IDL software package developed at Colgate University over the summer of 2010, the Quasar User Interface (QUI), provides effective computation of four time series functions for analyzing underlying trends present in generic, discretely sampled data sets. Using the Autocorrelation Function, Structure Function, and Power Spectrum, we are able to quickly identify possible variability timescales. QUI is also capable of computing the Cross-Correlation Function for comparing variability at different wavelengths. We apply these algorithms to 1749+096 and present our analysis of the timescales for this object. Funding for this project was received from Colgate University, the Justus and Jayne Schlichting Student Research Fund, and the NASA / New York Space Grant.
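Two of the four time series functions mentioned, the autocorrelation function and the structure function, can be sketched directly. This is an illustrative reimplementation on synthetic data, not the QUI package itself; the period and noise level are invented.

```python
import math
import random

def acf(x, lag):
    """Sample autocorrelation function at a given lag."""
    n = len(x)
    m = sum(x) / n
    var = sum((v - m) ** 2 for v in x)
    return sum((x[t] - m) * (x[t + lag] - m) for t in range(n - lag)) / var

def structure_function(x, lag):
    """First-order structure function: mean squared difference at a given lag."""
    n = len(x)
    return sum((x[t + lag] - x[t]) ** 2 for t in range(n - lag)) / (n - lag)

random.seed(3)
period = 50  # hypothetical variability timescale, in sampling intervals
x = [math.sin(2 * math.pi * t / period) + random.gauss(0, 0.2) for t in range(2000)]

# A periodic signal gives an ACF peak (and a structure-function minimum) at one period
print(acf(x, period), acf(x, period // 2))
print(structure_function(x, period), structure_function(x, period // 2))
```

Scanning these functions over a range of lags is how candidate variability timescales are identified; a peak in the ACF (or a dip in the structure function) flags a recurring timescale.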

  17. Time Series Analysis of Insar Data: Methods and Trends

    Science.gov (United States)

    Osmanoglu, Batuhan; Sunar, Filiz; Wdowinski, Shimon; Cano-Cabral, Enrique

    2015-01-01

    Time series analysis of InSAR data has emerged as an important tool for monitoring and measuring the displacement of the Earth's surface. Changes in the Earth's surface can result from a wide range of phenomena such as earthquakes, volcanoes, landslides, variations in ground water levels, and changes in wetland water levels. Time series analysis is applied to interferometric phase measurements, which wrap around when the observed motion is larger than one-half of the radar wavelength. Thus, the spatio-temporal "unwrapping" of phase observations is necessary to obtain physically meaningful results. Several different algorithms have been developed for time series analysis of InSAR data to solve for this ambiguity. These algorithms may employ different models for time series analysis, but they all generate a first-order deformation rate, which can be compared to each other. However, there is no single algorithm that can provide optimal results in all cases. Since time series analyses of InSAR data are used in a variety of applications with different characteristics, each algorithm possesses inherently unique strengths and weaknesses. In this review article, following a brief overview of InSAR technology, we discuss several algorithms developed for time series analysis of InSAR data using an example set of results for measuring subsidence rates in Mexico City.
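For an evenly sampled one-dimensional series, the unwrapping step described above reduces to removing 2π jumps between neighbours. The sketch below shows only that temporal core on an invented subsidence signal; real InSAR unwrapping is spatio-temporal and considerably harder, which is why the review compares whole algorithm families.

```python
import math

def unwrap(phases):
    """Unwrap a sequence of phases (radians) by removing 2*pi jumps between neighbours."""
    out = [phases[0]]
    offset = 0.0
    for prev, cur in zip(phases, phases[1:]):
        d = cur - prev
        if d > math.pi:
            offset -= 2 * math.pi
        elif d < -math.pi:
            offset += 2 * math.pi
        out.append(cur + offset)
    return out

# Hypothetical steady deformation: the true phase grows linearly, but the
# interferometric measurement only records it wrapped into (-pi, pi]
true = [0.4 * t for t in range(40)]
wrapped = [math.atan2(math.sin(p), math.cos(p)) for p in true]
recovered = unwrap(wrapped)
print(max(abs(a - b) for a, b in zip(true, recovered)))  # close to zero
```

The method succeeds only because consecutive true phase increments stay below π, i.e. the motion between samples is under half a wavelength; this is precisely the ambiguity condition the abstract describes.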

  18. Stochastic Analysis : A Series of Lectures

    CERN Document Server

    Dozzi, Marco; Flandoli, Franco; Russo, Francesco

    2015-01-01

    This book presents in thirteen refereed survey articles an overview of modern activity in stochastic analysis, written by leading international experts. The topics addressed include stochastic fluid dynamics and regularization by noise of deterministic dynamical systems; stochastic partial differential equations driven by Gaussian or Lévy noise, including the relationship between parabolic equations and particle systems, and wave equations in a geometric framework; Malliavin calculus and applications to stochastic numerics; stochastic integration in Banach spaces; porous media-type equations; stochastic deformations of classical mechanics and Feynman integrals and stochastic differential equations with reflection. The articles are based on short courses given at the Centre Interfacultaire Bernoulli of the Ecole Polytechnique Fédérale de Lausanne, Switzerland, from January to June 2012. They offer a valuable resource not only for specialists, but also for other researchers and Ph.D. students in the fields o...

  19. Surgical techniques for the treatment of ankyloglossia in children: a case series

    Directory of Open Access Journals (Sweden)

    Marina Azevedo JUNQUEIRA

    2014-06-01

    Full Text Available This paper reports a series of clinical cases of ankyloglossia in children, which were approached by different techniques: frenotomy and frenectomy with the use of one hemostat, two hemostats, a groove director or laser. Information on the indications, contraindications, advantages and disadvantages of the techniques was also presented. Children diagnosed with ankyloglossia were subjected to different surgical procedures. The choice of the techniques was based on the age of the patient, length of the frenulum and availability of the instruments and equipment. All the techniques presented are successful for the treatment of ankyloglossia and require a skilled professional. Laser may be considered a simple and safe alternative for children while reducing the amount of local anesthetics needed, the bleeding and the chances of infection, swelling and discomfort.

  20. Nonlinear techniques for forecasting solar activity directly from its time series

    Science.gov (United States)

    Ashrafi, S.; Roszman, L.; Cooley, J.

    1993-01-01

    This paper presents numerical techniques for constructing nonlinear predictive models to forecast solar flux directly from its time series. This approach makes it possible to extract dynamical invariants of the system without reference to any underlying solar physics. We consider the dynamical evolution of solar activity in a reconstructed phase space that captures the (strange) attractor, give a procedure for constructing a predictor of future solar activity, and discuss extraction of dynamical invariants such as Lyapunov exponents and attractor dimension.

  1. Multivariate time series analysis with R and financial applications

    CERN Document Server

    Tsay, Ruey S

    2013-01-01

    Since the publication of his first book, Analysis of Financial Time Series, Ruey Tsay has become one of the most influential and prominent experts on the topic of time series. Different from the traditional and oftentimes complex approach to multivariate (MV) time series, this sequel book emphasizes structural specification, which results in simplified parsimonious VARMA modeling and, hence, eases comprehension. Through a fundamental balance between theory and applications, the book supplies readers with an accessible approach to financial econometric models and their applications to real-worl

  2. Elements of nonlinear time series analysis and forecasting

    CERN Document Server

    De Gooijer, Jan G

    2017-01-01

    This book provides an overview of the current state-of-the-art of nonlinear time series analysis, richly illustrated with examples, pseudocode algorithms and real-world applications. Avoiding a “theorem-proof” format, it shows concrete applications on a variety of empirical time series. The book can be used in graduate courses in nonlinear time series and at the same time also includes interesting material for more advanced readers. Though it is largely self-contained, readers require an understanding of basic linear time series concepts, Markov chains and Monte Carlo simulation methods. The book covers time-domain and frequency-domain methods for the analysis of both univariate and multivariate (vector) time series. It makes a clear distinction between parametric models on the one hand, and semi- and nonparametric models/methods on the other. This offers the reader the option of concentrating exclusively on one of these nonlinear time series analysis methods. To make the book as user friendly as possible...

  3. A Reception Analysis on the Youth Audiences of TV Series in Marivan

    Directory of Open Access Journals (Sweden)

    Omid Karimi

    2014-03-01

    Full Text Available The aim of this article is to describe the role of foreign media as agitators of popular culture. To that end, reception analysis is used to describe how youth audiences decode these series. Globalization theory and reception theory in communication form the theoretical framework of the article. The methodology is qualitative, with two techniques, in-depth interviews and observation, used for data collection. The results show that different people, depending on individual features and social and cultural backgrounds, incline toward particular characters and identify with them. This inclination can go so far that audiences follow a series because of a favorite character. There is also great compatibility between audience backgrounds and their receptions. A number of audience members criticized the series and pointed out their negative consequences for society; nevertheless, they continued watching, preferring the enjoyment of the series to its risks.

  4. Review and classification of variability analysis techniques with clinical applications.

    Science.gov (United States)

    Bravi, Andrea; Longtin, André; Seely, Andrew J E

    2011-10-10

    Analysis of patterns of variation of time-series, termed variability analysis, represents a rapidly evolving discipline with increasing applications in different fields of science. In medicine and in particular critical care, efforts have focussed on evaluating the clinical utility of variability. However, the growth and complexity of techniques applicable to this field have made interpretation and understanding of variability more challenging. Our objective is to provide an updated review of variability analysis techniques suitable for clinical applications. We review more than 70 variability techniques, providing for each technique a brief description of the underlying theory and assumptions, together with a summary of clinical applications. We propose a revised classification for the domains of variability techniques, which include statistical, geometric, energetic, informational, and invariant. We discuss the process of calculation, often necessitating a mathematical transform of the time-series. Our aims are to summarize a broad literature, promote a shared vocabulary that would improve the exchange of ideas, and the analyses of the results between different studies. We conclude with challenges for the evolving science of variability analysis.

  5. Review and classification of variability analysis techniques with clinical applications

    Science.gov (United States)

    2011-01-01

    Analysis of patterns of variation of time-series, termed variability analysis, represents a rapidly evolving discipline with increasing applications in different fields of science. In medicine and in particular critical care, efforts have focussed on evaluating the clinical utility of variability. However, the growth and complexity of techniques applicable to this field have made interpretation and understanding of variability more challenging. Our objective is to provide an updated review of variability analysis techniques suitable for clinical applications. We review more than 70 variability techniques, providing for each technique a brief description of the underlying theory and assumptions, together with a summary of clinical applications. We propose a revised classification for the domains of variability techniques, which include statistical, geometric, energetic, informational, and invariant. We discuss the process of calculation, often necessitating a mathematical transform of the time-series. Our aims are to summarize a broad literature, promote a shared vocabulary that would improve the exchange of ideas, and the analyses of the results between different studies. We conclude with challenges for the evolving science of variability analysis. PMID:21985357

  6. On-line condition monitoring of nuclear systems via symbolic time series analysis

    International Nuclear Information System (INIS)

    Rajagopalan, V.; Ray, A.; Garcia, H. E.

    2006-01-01

    This paper provides a symbolic time series analysis approach to fault diagnostics and condition monitoring. The proposed technique is built upon concepts from wavelet theory, symbolic dynamics and pattern recognition. Various aspects of the methodology such as wavelet selection, choice of alphabet and determination of depth of D-Markov Machine are explained in the paper. The technique is validated with experiments performed in a Machine Condition Monitoring (MCM) test bed at the Idaho National Laboratory. (authors)

  7. Nuclear analysis techniques and environmental sciences

    International Nuclear Information System (INIS)

    1997-10-01

    31 theses are collected in this book. It introduces molecular activation analysis, micro-PIXE and micro-probe analysis, x-ray fluorescence analysis and accelerator mass spectrometry. Applications of these nuclear analysis techniques in the environmental sciences are presented and reviewed

  8. Chemical analysis by nuclear techniques

    International Nuclear Information System (INIS)

    Sohn, S. C.; Kim, W. H.; Park, Y. J.; Park, Y. J.; Song, B. C.; Jeon, Y. S.; Jee, K. Y.; Pyo, H. Y.

    2002-01-01

    This state-of-the-art report consists of four parts: production of micro-particles, analysis of boron, the alpha tracking method and development of a neutron induced prompt gamma ray spectroscopy (NIPS) system. The various methods for the production of micro-particles, such as mechanical, electrolysis, chemical and spray methods, are described in the first part. The second part covers sample treatment, separation and concentration, analytical methods, and applications of boron analysis. The third part covers characteristics of alpha tracks, track detectors, pretreatment of samples, neutron irradiation, etching conditions for various detectors, observation of tracks on the detector, etc. The last part covers basic theory, the neutron source, collimator, neutron shields, calibration of NIPS, and applications of the NIPS system

  9. Chemical analysis by nuclear techniques

    Energy Technology Data Exchange (ETDEWEB)

    Sohn, S. C.; Kim, W. H.; Park, Y. J.; Song, B. C.; Jeon, Y. S.; Jee, K. Y.; Pyo, H. Y

    2002-01-01

    This state-of-the-art report consists of four parts: production of micro-particles, analysis of boron, the alpha tracking method and development of a neutron induced prompt gamma ray spectroscopy (NIPS) system. The various methods for the production of micro-particles, such as mechanical, electrolysis, chemical and spray methods, are described in the first part. The second part covers sample treatment, separation and concentration, analytical methods, and applications of boron analysis. The third part covers characteristics of alpha tracks, track detectors, pretreatment of samples, neutron irradiation, etching conditions for various detectors, observation of tracks on the detector, etc. The last part covers basic theory, the neutron source, collimator, neutron shields, calibration of NIPS, and applications of the NIPS system.

  10. Nonlinear time series analysis of the human electrocardiogram

    International Nuclear Information System (INIS)

    Perc, Matjaz

    2005-01-01

    We analyse the human electrocardiogram with simple nonlinear time series analysis methods that are appropriate for graduate as well as undergraduate courses. In particular, attention is devoted to the notions of determinism and stationarity in physiological data. We emphasize that methods of nonlinear time series analysis can be successfully applied only if the studied data set originates from a deterministic stationary system. After positively establishing the presence of determinism and stationarity in the studied electrocardiogram, we calculate the maximal Lyapunov exponent, thus providing interesting insights into the dynamics of the human heart. Moreover, to facilitate interest and enable the integration of nonlinear time series analysis methods into the curriculum at an early stage of the educational process, we also provide user-friendly programs for each implemented method
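The maximal Lyapunov exponent mentioned above can be estimated from a scalar series by delay embedding and tracking how nearest neighbours diverge (a Rosenstein-style estimate). The sketch below uses the chaotic logistic map instead of ECG data, and the embedding dimension, delay, and horizon are illustrative choices, not the paper's settings.

```python
import math

# Surrogate scalar series: the chaotic logistic map x -> 4 x (1 - x)
x, data = 0.3, []
for _ in range(600):
    x = 4 * x * (1 - x)
    data.append(x)

def largest_lyapunov(series, dim=2, tau=1, steps=3, min_sep=10):
    """Rosenstein-style estimate: mean log divergence rate of nearest neighbours
    in a delay-embedded phase space."""
    m = len(series) - (dim - 1) * tau
    emb = [tuple(series[i + j * tau] for j in range(dim)) for i in range(m)]

    def dist(a, b):
        return math.sqrt(sum((p - q) ** 2 for p, q in zip(a, b)))

    rates = []
    for i in range(m - steps):
        # nearest neighbour, excluding temporally close (trivially correlated) points
        j = min((j for j in range(m - steps) if abs(i - j) > min_sep),
                key=lambda j: dist(emb[i], emb[j]))
        d0, d1 = dist(emb[i], emb[j]), dist(emb[i + steps], emb[j + steps])
        if d0 > 0 and d1 > 0:
            rates.append(math.log(d1 / d0) / steps)
    return sum(rates) / len(rates)

lam = largest_lyapunov(data)
print(lam)  # positive for a chaotic signal; ln 2 for this map in the long-time limit
```

As the abstract stresses, such an estimate is only meaningful after determinism and stationarity of the data have been established; on a stochastic signal the same procedure returns a number with no dynamical interpretation.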

  11. Koopman Operator Framework for Time Series Modeling and Analysis

    Science.gov (United States)

    Surana, Amit

    2018-01-01

    We propose an interdisciplinary framework for time series classification, forecasting, and anomaly detection by combining concepts from Koopman operator theory, machine learning, and linear systems and control theory. At the core of this framework is nonlinear dynamic generative modeling of time series using the Koopman operator which is an infinite-dimensional but linear operator. Rather than working with the underlying nonlinear model, we propose two simpler linear representations or model forms based on Koopman spectral properties. We show that these model forms are invariants of the generative model and can be readily identified directly from data using techniques for computing Koopman spectral properties without requiring the explicit knowledge of the generative model. We also introduce different notions of distances on the space of such model forms which is essential for model comparison/clustering. We employ the space of Koopman model forms equipped with distance in conjunction with classical machine learning techniques to develop a framework for automatic feature generation for time series classification. The forecasting/anomaly detection framework is based on using Koopman model forms along with classical linear systems and control approaches. We demonstrate the proposed framework for human activity classification, and for time series forecasting/anomaly detection in power grid application.
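Identifying a linear Koopman model form directly from data can be illustrated with extended dynamic mode decomposition (EDMD) on a toy system whose chosen observables span an exactly invariant subspace. This is a hedged sketch of the general idea, not the authors' framework; the system, observables, and parameter a are invented.

```python
# EDMD sketch: fit a linear Koopman model form directly from data.
# Toy system x_{k+1} = a * x_k with observables g1(x) = x and g2(x) = x^2;
# in these coordinates the dynamics are exactly linear, so the fitted
# matrix should be close to diag(a, a^2).
a = 0.9
xs = [1.0]
for _ in range(50):
    xs.append(a * xs[-1])

X = [[x, x * x] for x in xs[:-1]]   # observables at step k
Y = [[x, x * x] for x in xs[1:]]    # observables at step k + 1

def gram(P, Q):
    """P transpose times Q, for matrices stored as lists of rows."""
    return [[sum(P[i][p] * Q[i][q] for i in range(len(P))) for q in range(len(Q[0]))]
            for p in range(len(P[0]))]

# Least-squares A minimizing ||Y - X A||: A = (X'X)^(-1) X'Y (2x2 inverse by hand)
XtX, XtY = gram(X, X), gram(X, Y)
det = XtX[0][0] * XtX[1][1] - XtX[0][1] * XtX[1][0]
inv = [[XtX[1][1] / det, -XtX[0][1] / det],
       [-XtX[1][0] / det, XtX[0][0] / det]]
A = [[sum(inv[i][k] * XtY[k][j] for k in range(2)) for j in range(2)] for i in range(2)]
print(A)  # close to [[0.9, 0.0], [0.0, 0.81]]
```

The eigenvalues of the fitted matrix approximate Koopman eigenvalues, which is the kind of spectral property the framework uses for model comparison and feature generation; in realistic settings the observable dictionary is larger and only approximately invariant.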

  12. A reference data set for validating vapor pressure measurement techniques: homologous series of polyethylene glycols

    Science.gov (United States)

    Krieger, Ulrich K.; Siegrist, Franziska; Marcolli, Claudia; Emanuelsson, Eva U.; Gøbel, Freya M.; Bilde, Merete; Marsh, Aleksandra; Reid, Jonathan P.; Huisman, Andrew J.; Riipinen, Ilona; Hyttinen, Noora; Myllys, Nanna; Kurtén, Theo; Bannan, Thomas; Percival, Carl J.; Topping, David

    2018-01-01

    To predict atmospheric partitioning of organic compounds between gas and aerosol particle phase based on explicit models for gas phase chemistry, saturation vapor pressures of the compounds need to be estimated. Estimation methods based on functional group contributions require training sets of compounds with well-established saturation vapor pressures. However, vapor pressures of semivolatile and low-volatility organic molecules at atmospheric temperatures reported in the literature often differ by several orders of magnitude between measurement techniques. These discrepancies exceed the stated uncertainty of each technique, which is generally reported to be smaller than a factor of 2. At present, there is no general reference technique for measuring saturation vapor pressures of atmospherically relevant compounds with low vapor pressures at atmospheric temperatures. To address this problem, we measured vapor pressures with different techniques over a wide temperature range for intercomparison and to establish a reliable training set. We determined saturation vapor pressures for the homologous series of polyethylene glycols (H-(O-CH2-CH2)n-OH) for n = 3 to n = 8, ranging in vapor pressure at 298 K from 10^-7 to 5×10^-2 Pa, and compare them with quantum chemistry calculations. Such a homologous series provides a reference set that covers several orders of magnitude in saturation vapor pressure, allowing a critical assessment of the lower limits of detection of vapor pressures for the different techniques as well as permitting the identification of potential sources of systematic error. Also, internal consistency within the series allows outlying data to be rejected more easily. Most of the measured vapor pressures agreed within the stated uncertainty range. Deviations mostly occurred for vapor pressure values approaching the lower detection limit of a technique. The good agreement between the measurement techniques (some of which are sensitive to the mass
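Vapor pressures measured over a temperature range, as described above, are commonly reduced via a Clausius-Clapeyron fit: ln p is approximately linear in 1/T, and the slope gives the effective vaporization enthalpy. The sketch below uses invented (T, p) pairs, not the paper's polyethylene glycol data.

```python
from math import exp, log

# Hypothetical (temperature K, saturation vapor pressure Pa) measurements.
# Values are invented for illustration; they merely mimic a semivolatile compound.
data = [(283.15, 1.2e-4), (293.15, 4.1e-4), (298.15, 7.3e-4), (308.15, 2.2e-3)]

xs = [1.0 / T for T, p in data]
ys = [log(p) for T, p in data]
n = len(data)
mx, my = sum(xs) / n, sum(ys) / n
# Straight-line least-squares fit of ln p against 1/T
slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
intercept = my - slope * mx

R = 8.314                                  # gas constant, J/(mol K)
dH_vap = -slope * R                        # effective enthalpy of vaporization, J/mol
p_298 = exp(intercept + slope / 298.15)    # fitted vapor pressure at 298 K
print(dH_vap / 1000.0, p_298)              # kJ/mol, Pa
```

Comparing fitted slopes and 298 K intercepts across members of a homologous series is one way the internal consistency mentioned in the abstract can be checked, since both should vary smoothly with n.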

  13. On statistical inference in time series analysis of the evolution of road safety.

    Science.gov (United States)

    Commandeur, Jacques J F; Bijleveld, Frits D; Bergel-Hayat, Ruth; Antoniou, Constantinos; Yannis, George; Papadimitriou, Eleonora

    2013-11-01

    Data collected for building a road safety observatory usually include observations made sequentially through time. Examples of such data, called time series data, include annual (or monthly) numbers of road traffic accidents, traffic fatalities or vehicle kilometers driven in a country, as well as the corresponding values of safety performance indicators (e.g., data on speeding, seat belt use, alcohol use, etc.). Some commonly used statistical techniques imply assumptions that are often violated by the special properties of time series data, namely serial dependency among disturbances associated with the observations. The first objective of this paper is to demonstrate the impact of such violations on the applicability of standard methods of statistical inference, which leads to an under- or overestimation of the standard error and consequently may produce erroneous inferences. Moreover, having established the adverse consequences of ignoring serial dependency issues, the paper aims to describe rigorous statistical techniques used to overcome them. In particular, appropriate time series analysis techniques of varying complexity are employed to describe the development over time, relating accident occurrences to explanatory factors such as exposure measures or safety performance indicators, and forecasting the development into the near future. Traditional regression models (whether they are linear, generalized linear or nonlinear) are shown not to naturally capture the inherent dependencies in time series data. Dedicated time series analysis techniques, such as the ARMA-type and DRAG approaches, are discussed next, followed by structural time series models, which are a subclass of state space methods. The paper concludes with general recommendations and practice guidelines for the use of time series models in road safety research. Copyright © 2012 Elsevier Ltd. All rights reserved.
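The serial dependency at issue above can be screened for with the Durbin-Watson statistic on model residuals, which is near 2 for independent disturbances and well below 2 under positive autocorrelation. A minimal sketch contrasting independent and AR(1) errors (the AR coefficient and sample size are illustrative):

```python
import random

def durbin_watson(resid):
    """Durbin-Watson statistic: near 2 for independent residuals,
    well below 2 under positive serial correlation."""
    num = sum((resid[t] - resid[t - 1]) ** 2 for t in range(1, len(resid)))
    return num / sum(r * r for r in resid)

random.seed(7)
iid = [random.gauss(0, 1) for _ in range(2000)]        # independent disturbances
ar1 = [0.0]
for _ in range(2000):
    ar1.append(0.8 * ar1[-1] + random.gauss(0, 1))     # AR(1): positively correlated
ar1 = ar1[1:]

print(durbin_watson(iid), durbin_watson(ar1))
```

Fitting an ordinary regression to data whose disturbances behave like the second series is exactly the situation the paper warns about: the usual standard-error formulas assume the first.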

  14. Maximum Bandwidth Enhancement of Current Mirror using Series-Resistor and Dynamic Body Bias Technique

    Directory of Open Access Journals (Sweden)

    V. Niranjan

    2014-09-01

    Full Text Available This paper introduces a new approach for enhancing the bandwidth of a low voltage CMOS current mirror. The proposed approach is based on utilizing the body effect in a MOS transistor by connecting its gate and bulk terminals together for signal input. This boosts the effective transconductance of the MOS transistor and reduces the threshold voltage. The proposed approach does not affect the DC gain of the current mirror. We demonstrate that the proposed approach is compatible with the widely used series-resistor technique for enhancing current mirror bandwidth, and both techniques have been employed simultaneously for maximum bandwidth enhancement. An important consequence of using both techniques simultaneously is the reduction of the series-resistor value needed to achieve the same bandwidth. This reduction is very attractive because a smaller resistor results in smaller chip area and less noise. PSpice simulation results using 180 nm CMOS technology from TSMC are included to validate the analysis. The proposed current mirror operates at 1 V, consuming only 102 µW, and a maximum bandwidth extension ratio of 1.85 has been obtained using the proposed approach. Simulation results are in good agreement with analytical predictions.

  15. Time series analysis and its applications with R examples

    CERN Document Server

    Shumway, Robert H

    2017-01-01

    The fourth edition of this popular graduate textbook, like its predecessors, presents a balanced and comprehensive treatment of both time and frequency domain methods with accompanying theory. Numerous examples using nontrivial data illustrate solutions to problems such as discovering natural and anthropogenic climate change, evaluating pain perception experiments using functional magnetic resonance imaging, and monitoring a nuclear test ban treaty. The book is designed as a textbook for graduate level students in the physical, biological, and social sciences and as a graduate level text in statistics. Some parts may also serve as an undergraduate introductory course. Theory and methodology are separated to allow presentations on different levels. In addition to coverage of classical methods of time series regression, ARIMA models, spectral analysis and state-space models, the text includes modern developments including categorical time series analysis, multivariate spectral methods, long memory series, nonli...

  16. Multiresolution analysis of Bursa Malaysia KLCI time series

    Science.gov (United States)

    Ismail, Mohd Tahir; Dghais, Amel Abdoullah Ahmed

    2017-05-01

    In general, a time series is simply a sequence of numbers collected at regular intervals over a period of time. Financial time series processing is concerned with the theory and practice of processing asset prices over time, such as currency, commodity and stock market data. The primary aim of this study is to understand the fundamental characteristics of selected financial time series by using both time and frequency domain analysis. Prediction can then be carried out for the desired system via in-sample forecasting. In this study, multiresolution analysis with the aid of the discrete wavelet transform (DWT) and the maximal overlap discrete wavelet transform (MODWT) is used to pinpoint special characteristics of Bursa Malaysia KLCI (Kuala Lumpur Composite Index) daily closing prices and return values. In addition, further case study discussions include the modeling of Bursa Malaysia KLCI using linear ARIMA with wavelets to address how the multiresolution approach improves fitting and forecasting results.
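    As an illustrative sketch of the multiresolution idea in this record, one level of a Haar DWT can be written in a few lines of Python (the wavelet choice and the toy data are hypothetical; the study itself applies DWT/MODWT to KLCI closing prices):

    ```python
    import math

    def haar_dwt(signal):
        """One level of the Haar discrete wavelet transform.

        Returns (approximation, detail): the approximation captures the
        low-frequency trend, the detail the high-frequency fluctuations.
        """
        assert len(signal) % 2 == 0, "length must be even for one Haar level"
        s = math.sqrt(2.0)
        approx = [(signal[i] + signal[i + 1]) / s for i in range(0, len(signal), 2)]
        detail = [(signal[i] - signal[i + 1]) / s for i in range(0, len(signal), 2)]
        return approx, detail

    def haar_idwt(approx, detail):
        """Inverse of one Haar level; reconstruction is exact."""
        s = math.sqrt(2.0)
        out = []
        for a, d in zip(approx, detail):
            out.append((a + d) / s)
            out.append((a - d) / s)
        return out
    ```

    Deeper multiresolution levels follow by re-applying `haar_dwt` to the approximation coefficients; the MODWT additionally avoids the downsampling step, which is why it handles arbitrary-length series.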

  17. Time Series Analysis of Wheat Futures Reward in China

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

    Unlike previous research, which has focused mainly on single futures contracts and lacks comparisons across periods, this paper describes the statistical characteristics of the wheat futures reward time series of the Zhengzhou Commodity Exchange over the recent three years. Beyond basic statistical analysis, the paper used GARCH and EGARCH models to describe those series exhibiting the ARCH effect, and analyzed the persistence of volatility shocks and the leverage effect. The results showed that, compared with a normal distribution, the wheat futures reward series were non-normal, with leptokurtic and thick-tailed distributions. The study also found that two of the reward series had no autocorrelation. Among the six correlated series, three presented the ARCH effect. Using the autoregressive distributed lag model, the GARCH model and the EGARCH model, the paper demonstrates the persistence of volatility shocks and the leverage effect on the wheat futures reward time series. The results reveal that, on the one hand, the statistical characteristics of the wheat futures reward are broadly similar to those of mature futures markets abroad; on the other hand, they reflect shortcomings such as immaturity and over-regulation by the government in the Chinese futures market.
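    The GARCH(1,1) recursion behind the ARCH-effect analysis in this record can be sketched as a simulation (the parameter values and seed below are hypothetical, chosen only to exhibit volatility persistence):

    ```python
    import math
    import random

    def simulate_garch11(n, omega=0.05, alpha=0.1, beta=0.85, seed=42):
        """Simulate returns r_t = sigma_t * z_t with the GARCH(1,1) recursion
        sigma_t^2 = omega + alpha * r_{t-1}^2 + beta * sigma_{t-1}^2.

        alpha + beta < 1 keeps the process stationary; a sum close to 1
        produces the persistent volatility shocks described above.
        """
        rng = random.Random(seed)
        var = omega / (1.0 - alpha - beta)  # unconditional variance
        r = 0.0
        returns, variances = [], []
        for _ in range(n):
            var = omega + alpha * r * r + beta * var
            r = math.sqrt(var) * rng.gauss(0.0, 1.0)
            returns.append(r)
            variances.append(var)
        return returns, variances
    ```

    EGARCH differs by modeling log-variance with an asymmetric shock term, which is what captures the leverage effect.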

  18. Laurent series expansion of sunrise-type diagrams using configuration space techniques

    International Nuclear Information System (INIS)

    Groote, S.; Koerner, J.G.; Pivovarov, A.A.

    2004-01-01

    We show that configuration space techniques can be used to efficiently calculate the complete Laurent series ε-expansion of sunrise-type diagrams to any loop order in D-dimensional space-time for any external momentum and for arbitrary mass configurations. For negative powers of ε the results are obtained in analytical form. For positive powers of ε, including the finite ε⁰ contribution, the result is obtained numerically in terms of low-dimensional integrals. We present general features of the calculation and provide exemplary results up to five-loop order which are compared to available results in the literature. (orig.)

  19. MCNP perturbation technique for criticality analysis

    International Nuclear Information System (INIS)

    McKinney, G.W.; Iverson, J.L.

    1995-01-01

    The differential operator perturbation technique has been incorporated into the Monte Carlo N-Particle transport code MCNP and will become a standard feature of future releases. This feature includes first- and/or second-order terms of the Taylor series expansion for response perturbations related to cross-section data (i.e., density, composition, etc.). Criticality analyses can benefit from this technique in that predicted changes in the track-length tally estimator of k_eff may be obtained for multiple perturbations in a single run. A key advantage of this method is that a precise estimate of a small change in response (i.e., < 1%) is easily obtained. This technique can also offer acceptable accuracy, to within a few percent, for up to 20-30% changes in a response.
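    The first- and second-order Taylor estimates described in this record can be sketched generically (a hypothetical scalar response is used here, and the derivatives are taken by finite differences, whereas MCNP obtains them from differential-operator tallies):

    ```python
    def taylor_perturbation(response, x0, dx, h=1e-5):
        """First- and second-order Taylor estimates of the change in a
        response when a parameter moves from x0 to x0 + dx.

        Derivatives are approximated by central finite differences; the
        MCNP feature computes them from tallies instead.
        """
        f0 = response(x0)
        d1 = (response(x0 + h) - response(x0 - h)) / (2.0 * h)
        d2 = (response(x0 + h) - 2.0 * f0 + response(x0 - h)) / (h * h)
        first = d1 * dx                      # first-order term only
        second = first + 0.5 * d2 * dx * dx  # add second-order term
        return first, second
    ```

    For a 20% perturbation the second-order estimate tracks the exact change noticeably better than the first-order one, matching the accuracy claim in the abstract.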

  20. Time series analysis in chaotic diode resonator circuit

    Energy Technology Data Exchange (ETDEWEB)

    Hanias, M.P. [TEI of Chalkis, GR 34400, Evia, Chalkis (Greece)] e-mail: mhanias@teihal.gr; Giannaris, G. [TEI of Chalkis, GR 34400, Evia, Chalkis (Greece); Spyridakis, A. [TEI of Chalkis, GR 34400, Evia, Chalkis (Greece); Rigas, A. [TEI of Chalkis, GR 34400, Evia, Chalkis (Greece)

    2006-01-01

    A diode resonator chaotic circuit is presented. Multisim is used to simulate the circuit and show the presence of chaos. Time series analysis was performed by the method proposed by Grassberger and Procaccia. The correlation dimension ν and the minimum embedding dimension m_min were calculated, as was the corresponding Kolmogorov entropy.

  1. Time series analysis in chaotic diode resonator circuit

    International Nuclear Information System (INIS)

    Hanias, M.P.; Giannaris, G.; Spyridakis, A.; Rigas, A.

    2006-01-01

    A diode resonator chaotic circuit is presented. Multisim is used to simulate the circuit and show the presence of chaos. Time series analysis was performed by the method proposed by Grassberger and Procaccia. The correlation dimension ν and the minimum embedding dimension m_min were calculated, as was the corresponding Kolmogorov entropy.
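    The Grassberger-Procaccia method named in these two records rests on a delay embedding and a correlation sum; a minimal sketch (brute-force pairwise distances, fine for short series) is:

    ```python
    def delay_embed(series, m, tau=1):
        """Time-delay embedding of a scalar series into dimension m."""
        return [tuple(series[i + k * tau] for k in range(m))
                for i in range(len(series) - (m - 1) * tau)]

    def correlation_integral(points, r):
        """Grassberger-Procaccia correlation sum C(r): the fraction of
        point pairs in the reconstructed phase space closer than r
        (Chebyshev distance)."""
        n = len(points)
        count = 0
        for i in range(n):
            for j in range(i + 1, n):
                if max(abs(a - b) for a, b in zip(points[i], points[j])) < r:
                    count += 1
        return 2.0 * count / (n * (n - 1))
    ```

    The slope of log C(r) versus log r over the scaling region estimates the correlation dimension ν, and the embedding dimension m at which that slope saturates gives m_min.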

  2. Time series analysis of monthly pulpwood use in the Northeast

    Science.gov (United States)

    James T. Bones

    1980-01-01

    Time series analysis was used to develop a model that depicts pulpwood use in the Northeast. The model is useful in forecasting future pulpwood requirements (short term) or monitoring pulpwood-use activity in relation to past use patterns. The model predicted a downturn in use during 1980.

  3. Multi-granular trend detection for time-series analysis

    NARCIS (Netherlands)

    van Goethem, A.I.; Staals, F.; Löffler, M.; Dykes, J.; Speckmann, B.

    2017-01-01

    Time series (such as stock prices) and ensembles (such as model runs for weather forecasts) are two important types of one-dimensional time-varying data. Such data is readily available in large quantities but visual analysis of the raw data quickly becomes infeasible, even for moderately sized data

  4. Time Series Analysis Based on Running Mann Whitney Z Statistics

    Science.gov (United States)

    A sensitive and objective time series analysis method based on the calculation of Mann Whitney U statistics is described. This method samples data rankings over moving time windows, converts those samples to Mann-Whitney U statistics, and then normalizes the U statistics to Z statistics using Monte-...
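    The moving-window scheme this record describes can be sketched as follows; note that the abstract's Monte Carlo normalization is truncated in the record, so the large-sample normal approximation is substituted here as a stand-in:

    ```python
    import math

    def mann_whitney_u(sample_a, sample_b):
        """Mann-Whitney U of sample_a versus sample_b (ties count half)."""
        u = 0.0
        for a in sample_a:
            for b in sample_b:
                if a > b:
                    u += 1.0
                elif a == b:
                    u += 0.5
        return u

    def running_z(series, window):
        """Slide two adjacent windows along the series, compute U at each
        position and normalize to Z using the large-sample mean n1*n2/2
        and variance n1*n2*(n1+n2+1)/12 (in place of the Monte Carlo
        normalization mentioned in the abstract)."""
        n1 = n2 = window
        mu = n1 * n2 / 2.0
        sigma = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12.0)
        zs = []
        for start in range(len(series) - 2 * window + 1):
            a = series[start:start + window]
            b = series[start + window:start + 2 * window]
            zs.append((mann_whitney_u(b, a) - mu) / sigma)
        return zs
    ```

    A sustained positive run of Z values then flags an upward shift between adjacent windows.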

  5. Time series analysis in astronomy: Limits and potentialities

    DEFF Research Database (Denmark)

    Vio, R.; Kristensen, N.R.; Madsen, Henrik

    2005-01-01

    In this paper we consider the problem of the limits concerning the physical information that can be extracted from the analysis of one or more time series ( light curves) typical of astrophysical objects. On the basis of theoretical considerations and numerical simulations, we show that with no a...

  6. Time Series Analysis of 3D Coordinates Using Nonstochastic Observations

    NARCIS (Netherlands)

    Velsink, H.

    2016-01-01

    Adjustment and testing of a combination of stochastic and nonstochastic observations is applied to the deformation analysis of a time series of 3D coordinates. Nonstochastic observations are constant values that are treated as if they were observations. They are used to formulate constraints on

  7. Time Series Analysis of 3D Coordinates Using Nonstochastic Observations

    NARCIS (Netherlands)

    Hiddo Velsink

    2016-01-01

    From the article: Abstract Adjustment and testing of a combination of stochastic and nonstochastic observations is applied to the deformation analysis of a time series of 3D coordinates. Nonstochastic observations are constant values that are treated as if they were observations. They are used to

  8. Analysis and implementation of LLC-T series parallel resonant ...

    African Journals Online (AJOL)

    A prototype 300 W, 100 kHz converter is designed and built to experimentally demonstrate the dynamic and steady-state performance of the LLC-T series parallel resonant converter. A comparative study is performed between experimental results and the simulation studies. The analysis shows that the output of converter is ...

  9. Event tree analysis using artificial intelligence techniques

    International Nuclear Information System (INIS)

    Dixon, B.W.; Hinton, M.F.

    1985-01-01

    Artificial Intelligence (AI) techniques used in Expert Systems and Object Oriented Programming are discussed as they apply to Event Tree Analysis. A SeQUence IMPortance calculator, SQUIMP, is presented to demonstrate the implementation of these techniques. Benefits of using AI methods include ease of programming, efficiency of execution, and flexibility of application. The importance of an appropriate user interface is stressed. 5 figs

  10. Simultaneous determination of radionuclides separable into natural decay series by use of time-interval analysis

    International Nuclear Information System (INIS)

    Hashimoto, Tetsuo; Sanada, Yukihisa; Uezu, Yasuhiro

    2004-01-01

    A delayed coincidence method, time-interval analysis (TIA), has been applied to successive α-α decay events on the millisecond time-scale. Such decay events are part of the ²²⁰Rn → ²¹⁶Po (T₁/₂ = 145 ms) (Th-series) and ²¹⁹Rn → ²¹⁵Po (T₁/₂ = 1.78 ms) (Ac-series) decays. By using TIA in addition to measurement of ²²⁶Ra (U-series) from α-spectrometry by liquid scintillation counting (LSC), two natural decay series could be identified and separated. The TIA detection efficiency was improved by using the pulse-shape discrimination technique (PSD) to reject β-pulses, by solvent extraction of Ra combined with simple chemical separation, and by purging the scintillation solution with dry N₂ gas. The U- and Th-series, together with the Ac-series, were determined from alpha spectra and from TIA carried out immediately after Ra extraction, respectively. Using the ²²¹Fr → ²¹⁷At (T₁/₂ = 32.3 ms) decay process as a tracer, overall yields were estimated by applying TIA to ²²⁵Ra (Np-decay series) at the time of maximum growth. The present method has proven useful for simultaneous determination of three radioactive decay series in environmental samples. (orig.)
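    The delayed-coincidence counting behind TIA can be illustrated with a small simulation: each parent decay is followed by a daughter decay after an exponentially distributed delay, and TIA accepts pairs whose interval falls inside a time gate. The 145 ms half-life is from this record; the gate width, event count and seed are hypothetical:

    ```python
    import math
    import random

    def tia_coincidence_fraction(n_decays, half_life_ms, gate_ms, seed=0):
        """Fraction of parent-daughter decay pairs whose interval falls
        inside the TIA gate.  The daughter delay is exponential with the
        given half-life, so with a gate of g half-lives the expected
        accepted fraction is 1 - 2**(-g)."""
        rng = random.Random(seed)
        tau = half_life_ms / math.log(2.0)  # mean life from half-life
        hits = sum(1 for _ in range(n_decays)
                   if rng.expovariate(1.0 / tau) < gate_ms)
        return hits / n_decays
    ```

    For example, a 5-half-life gate on the 145 ms ²²⁰Rn→²¹⁶Po pair accepts roughly 97% of true pairs, while uncorrelated background pairs rarely fall inside so short a gate.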

  11. A comparison of various forecasting techniques applied to mean hourly wind speed time series

    Energy Technology Data Exchange (ETDEWEB)

    Sfetsos, A. [7 Pirsou Street, Athens (Greece)

    2000-09-01

    This paper presents a comparison of various forecasting approaches, using time series analysis, on mean hourly wind speed data. In addition to the traditional linear (ARMA) models and the commonly used feed forward and recurrent neural networks, other approaches are also examined including the Adaptive Neuro-Fuzzy Inference Systems (ANFIS) and Neural Logic Networks. The developed models are evaluated for their ability to produce accurate and fast forecasts. (Author)

  12. TV content analysis techniques and applications

    CERN Document Server

    Kompatsiaris, Yiannis

    2012-01-01

    The rapid advancement of digital multimedia technologies has not only revolutionized the production and distribution of audiovisual content, but also created the need to efficiently analyze TV programs to enable applications for content managers and consumers. Leaving no stone unturned, TV Content Analysis: Techniques and Applications provides a detailed exploration of TV program analysis techniques. Leading researchers and academics from around the world supply scientifically sound treatment of recent developments across the related subject areas--including systems, architectures, algorithms,

  13. Statistical evaluation of vibration analysis techniques

    Science.gov (United States)

    Milner, G. Martin; Miller, Patrice S.

    1987-01-01

    An evaluation methodology is presented for a selection of candidate vibration analysis techniques applicable to machinery representative of the environmental control and life support system of advanced spacecraft; illustrative results are given. Attention is given to the statistical analysis of small sample experiments, the quantification of detection performance for diverse techniques through the computation of probability of detection versus probability of false alarm, and the quantification of diagnostic performance.

  14. Optimizing Availability of a Framework in Series Configuration Utilizing Markov Model and Monte Carlo Simulation Techniques

    Directory of Open Access Journals (Sweden)

    Mansoor Ahmed Siddiqui

    2017-06-01

    Full Text Available This research work is aimed at optimizing the availability of a framework comprising two units linked together in series configuration, utilizing Markov Model and Monte Carlo (MC) Simulation techniques. In this article, effort has been made to develop a maintenance model that incorporates three distinct states for each unit, taking into account their different levels of deterioration. Calculations are carried out using the proposed model for two distinct cases of corrective repair, namely perfect and imperfect repairs, both with and without opportunistic maintenance. Initially, results are obtained using an analytical technique, i.e., the Markov Model. Validation of the results is then carried out with the help of MC Simulation. In addition, MC Simulation based codes also work well for frameworks that follow non-exponential failure and repair rates, and thus overcome the limitations of the Markov Model.
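    The analytical-versus-MC cross-check described in this record can be sketched in miniature with a two-state (up/down) unit rather than the paper's three-state model; the rates below are hypothetical:

    ```python
    import random

    def availability_analytic(lam, mu):
        """Steady-state availability of one repairable unit with failure
        rate lam and repair rate mu: A = mu / (lam + mu).  A series
        framework of independent units is up only when all units are up,
        so its availability is the product of the unit availabilities."""
        return mu / (lam + mu)

    def availability_mc(lam, mu, horizon, seed=1):
        """Monte Carlo estimate: alternate exponential up and down times
        and measure the fraction of time spent up."""
        rng = random.Random(seed)
        t = 0.0
        up = 0.0
        while t < horizon:
            u = rng.expovariate(lam)  # time to failure
            d = rng.expovariate(mu)   # time to repair
            up += min(u, horizon - t)
            t += u + d
        return up / horizon
    ```

    Swapping the exponential draws for Weibull or lognormal ones is what lets the MC route handle the non-exponential rates the Markov model cannot.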

  15. Constrained principal component analysis and related techniques

    CERN Document Server

    Takane, Yoshio

    2013-01-01

    In multivariate data analysis, regression techniques predict one set of variables from another while principal component analysis (PCA) finds a subspace of minimal dimensionality that captures the largest variability in the data. How can regression analysis and PCA be combined in a beneficial way? Why and when is it a good idea to combine them? What kind of benefits are we getting from them? Addressing these questions, Constrained Principal Component Analysis and Related Techniques shows how constrained PCA (CPCA) offers a unified framework for these approaches.The book begins with four concre

  16. A Fast Multi-layer Subnetwork Connection Method for Time Series InSAR Technique

    Directory of Open Access Journals (Sweden)

    WU Hong'an

    2016-10-01

    Full Text Available Nowadays, the time series interferometric synthetic aperture radar (InSAR) technique has been widely used in ground deformation monitoring, especially in urban areas where many stable point targets can be detected. However, in the standard time series InSAR technique, affected by the atmospheric correlation distance and the threshold of linear model coherence, the Delaunay triangulation for connecting point targets can easily be separated into many discontinuous subnetworks, which makes it difficult to retrieve ground deformation in non-urban areas. In order to monitor ground deformation over large areas efficiently, a novel multi-layer subnetwork connection (MLSC) method is proposed for connecting all subnetworks. The advantage of the method is that it can quickly reduce the number of subnetworks with valid edges layer by layer. The method is compared with the existing complex network connection method. The experimental results demonstrate that the data processing time of the proposed method is only 32.56% of that of the latter.

  17. Applications of soft computing in time series forecasting simulation and modeling techniques

    CERN Document Server

    Singh, Pritpal

    2016-01-01

    This book reports on an in-depth study of fuzzy time series (FTS) modeling. It reviews and summarizes previous research work in FTS modeling and also provides a brief introduction to other soft-computing techniques, such as artificial neural networks (ANNs), rough sets (RS) and evolutionary computing (EC), focusing on how these techniques can be integrated into different phases of the FTS modeling approach. In particular, the book describes novel methods resulting from the hybridization of FTS modeling approaches with neural networks and particle swarm optimization. It also demonstrates how a new ANN-based model can be successfully applied in the context of predicting Indian summer monsoon rainfall. Thanks to its easy-to-read style and the clear explanations of the models, the book can be used as a concise yet comprehensive reference guide to fuzzy time series modeling, and will be valuable not only for graduate students, but also for researchers and professionals working for academic, business and governmen...

  18. Time series analysis of ozone data in Isfahan

    Science.gov (United States)

    Omidvari, M.; Hassanzadeh, S.; Hosseinibalam, F.

    2008-07-01

    Time series analysis was used to investigate stratospheric ozone formation and decomposition processes. Different time series methods were applied to detect the reason for extremely high ozone concentrations in each season. The data were decomposed into a seasonal component and transformed to the frequency domain, the latter being evaluated using the Fast Fourier Transform (FFT) for spectral analysis. The power density spectrum estimated from the ozone data showed peaks at cycle durations of 22, 20, 36, 186, 365 and 40 days. According to the seasonal component analysis, the largest fluctuation occurred in 1999 and 2000, and the smallest in 2003. The best correlation between ozone and solar radiation was found in 2000. Other variables, which were not available, caused the fluctuations in 1999 and 2001. The ozone trend was increasing in 1999 and decreasing in the other years.
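    Finding cycle durations from a power spectrum, as in this record, can be sketched with a naive DFT (an FFT replaces the O(n²) sums in practice; the synthetic series below is hypothetical, not the ozone data):

    ```python
    import math

    def power_spectrum(x):
        """Naive DFT power spectrum |X_k|^2 for k = 1 .. n//2
        (k = 0, the mean, is skipped)."""
        n = len(x)
        mean = sum(x) / n
        powers = []
        for k in range(1, n // 2 + 1):
            re = sum((x[t] - mean) * math.cos(2 * math.pi * k * t / n)
                     for t in range(n))
            im = sum((x[t] - mean) * math.sin(2 * math.pi * k * t / n)
                     for t in range(n))
            powers.append(re * re + im * im)
        return powers

    def dominant_period(x):
        """Cycle length, in samples, at the spectral peak."""
        powers = power_spectrum(x)
        k = max(range(len(powers)), key=powers.__getitem__) + 1
        return len(x) / k
    ```

    Applied to a daily series, the peak bins translate directly into cycle durations such as the 365-day annual peak reported above.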

  19. Elemental analysis techniques using proton microbeam

    International Nuclear Information System (INIS)

    Sakai, Takuro; Oikawa, Masakazu; Sato, Takahiro

    2005-01-01

    Proton microbeam is a powerful tool for two-dimensional elemental analysis. The analysis is based on the Particle Induced X-ray Emission (PIXE) and Particle Induced Gamma-ray Emission (PIGE) techniques. The paper outlines the principles and instruments, and describes the dental application carried out at JAERI Takasaki. (author)

  20. Rainfall Prediction of Indian Peninsula: Comparison of Time Series Based Approach and Predictor Based Approach using Machine Learning Techniques

    Science.gov (United States)

    Dash, Y.; Mishra, S. K.; Panigrahi, B. K.

    2017-12-01

    Prediction of the northeast/post-monsoon rainfall that occurs during October, November and December (OND) over the Indian peninsula is a challenging task due to the dynamic nature of the uncertain, chaotic climate. It is imperative to elucidate this issue by examining the performance of different machine learning (ML) approaches. The prime objective of this research is to compare a) statistical prediction using historical rainfall observations and global atmosphere-ocean predictors like Sea Surface Temperature (SST) and Sea Level Pressure (SLP), and b) empirical prediction based on a time series analysis of past rainfall data without using any other predictors. Initially, ML techniques were applied to SST and SLP data (1948-2014) obtained from the NCEP/NCAR reanalysis monthly means provided by the NOAA ESRL PSD. This study then investigated the applicability of ML methods using the OND rainfall time series for 1948-2014, forecasting up to 2018. The predicted values of the aforementioned methods were verified using observed time series data collected from the Indian Institute of Tropical Meteorology, and the results revealed good performance of the ML algorithms with minimal error scores. Thus, both statistical and empirical methods are found useful for long-range climatic projections.

  1. Techniques for sensitivity analysis of SYVAC results

    International Nuclear Information System (INIS)

    Prust, J.O.

    1985-05-01

    Sensitivity analysis techniques may be required to examine the sensitivity of SYVAC model predictions to the input parameter values, to the subjective probability distributions assigned to the input parameters, and to the relationship between dose and the probability of fatal cancers plus serious hereditary disease in the first two generations of offspring of a member of the critical group. This report mainly considers techniques for determining the sensitivity of dose and risk to the variable input parameters. The performance of a sensitivity analysis technique may be improved by decomposing the model and data into subsets for analysis, making use of existing information on sensitivity, and concentrating sampling in regions of the parameter space that generate high doses or risks. A number of sensitivity analysis techniques are reviewed for their application to the SYVAC model, including four techniques tested in an earlier study by CAP Scientific for the SYVAC project. This report recommends the development of a method for evaluating the derivative of dose with respect to parameter value, and extending the Kruskal-Wallis technique to test for interactions between parameters. It is also recommended that the sensitivity of the output of each sub-model of SYVAC to input parameter values should be examined. (author)

  2. Flow analysis techniques for phosphorus: an overview.

    Science.gov (United States)

    Estela, José Manuel; Cerdà, Víctor

    2005-04-15

    A bibliographical review on the implementation and the results obtained in the use of different flow analytical techniques for the determination of phosphorus is carried out. The sources, occurrence and importance of phosphorus, together with several aspects regarding the analysis and terminology used in the determination of this element, are briefly described. A classification as well as a brief description of the basis, advantages and disadvantages of the different existing flow techniques, namely segmented flow analysis (SFA), flow injection analysis (FIA), sequential injection analysis (SIA), all injection analysis (AIA), batch injection analysis (BIA), multicommutated FIA (MCFIA), multisyringe FIA (MSFIA) and multipumped FIA (MPFIA), is also carried out. The most relevant manuscripts regarding the analysis of phosphorus by means of flow techniques are herein classified according to the instrumental detection technique used, with the aim of facilitating their study and providing an overall scope. Finally, the analytical characteristics of numerous flow methods reported in the literature are provided in the form of a table, and their applicability to samples with different matrixes, namely water samples (marine, river, estuarine, waste, industrial, drinking, etc.), soil leachates, plant leaves, toothpaste, detergents, foodstuffs (wine, orange juice, milk), biological samples, sugars, fertilizer, hydroponic solutions, soil extracts and cyanobacterial biofilms, is tabulated.

  3. Quality assurance techniques for activation analysis

    International Nuclear Information System (INIS)

    Becker, D.A.

    1984-01-01

    The principles and techniques of quality assurance are applied to the measurement method of activation analysis. Quality assurance is defined to include quality control and quality assessment. Plans for quality assurance include consideration of: personnel; facilities; analytical design; sampling and sample preparation; the measurement process; standards; and documentation. Activation analysis concerns include: irradiation; chemical separation; counting/detection; data collection, and analysis; and calibration. Types of standards discussed include calibration materials and quality assessment materials

  4. Time series analysis of nuclear instrumentation in EBR-II

    International Nuclear Information System (INIS)

    Imel, G.R.

    1996-01-01

    Results of a time series analysis of the scaler count data from the 3 wide-range nuclear detectors in the Experimental Breeder Reactor-II are presented. One of the channels was replaced, and it was desired to determine whether there was any statistically significant change (i.e., improvement) in the channel's response after the replacement. Data were collected from all 3 channels for 16-day periods before and after detector replacement. Time series analysis and statistical tests showed that there was no significant change after the detector replacement. Also, there were no statistically significant differences among the 3 channels, either before or after the replacement. Finally, it was determined that errors in the reactivity change inferred from subcritical count monitoring during fuel handling would be on the order of 20-30 cents for single count intervals.

  5. Mathematical methods in time series analysis and digital image processing

    CERN Document Server

    Kurths, J; Maass, P; Timmer, J

    2008-01-01

    The aim of this volume is to bring together research directions in theoretical signal and imaging processing developed rather independently in electrical engineering, theoretical physics, mathematics and the computer sciences. In particular, mathematically justified algorithms and methods, the mathematical analysis of these algorithms and methods, as well as the investigation of connections between methods from time series analysis and image processing are reviewed. An interdisciplinary comparison of these methods, drawing upon common sets of test problems from medicine and geophysical/environmental sciences, is also addressed. This volume coherently summarizes work carried out in the field of theoretical signal and image processing. It focuses on non-linear and non-parametric models for time series as well as on adaptive methods in image processing.

  6. Bridge flap technique as a single-step solution to mucogingival problems: A case series

    Directory of Open Access Journals (Sweden)

    Vivek Gupta

    2011-01-01

    Full Text Available Shallow vestibule, gingival recession, inadequate width of attached gingiva (AG) and aberrant frenum pull are an array of mucogingival problems for which several independent and effective surgical solutions are reported in the literature. This case series reports the effectiveness of the bridge flap technique as a single-step surgical procedure for increasing the depth of the vestibule, root coverage, increasing the width of the AG and solving the problem of abnormal frenum pull. Eight patients with 18 teeth altogether, having Miller's Class I, II or III recession along with problems of shallow vestibule, inadequate width of AG and with or without frenum pull, underwent this surgical procedure and were followed up to 9 months post-operatively. The mean root coverage obtained was 55% and the mean gain in width of the AG was 3.5 mm. The mean percentage gain in clinical attachment level was 41%. The bridge flap technique can be an effective single-step solution for the aforementioned mucogingival problems when they present simultaneously, and offers considerable advantages over other mucogingival surgical techniques in terms of simplicity, limited chair time for the patient and the operator, a single surgical intervention for manifold mucogingival problems and low morbidity because of the absence of palatal donor tissue.

  7. A numerical technique for reactor subchannel analysis

    International Nuclear Information System (INIS)

    Fath, Hassan E.S.

    1983-01-01

    A numerical technique is developed for the solution of the transient boundary layer equations with a moving liquid-vapour interface boundary. The technique uses the finite difference method with the velocity components defined over an Eulerian mesh. A system of interface massless markers is defined where the markers move with the flow field according to a simple kinematic relation between the interface geometry and the fluid velocity. Different applications of nuclear engineering interest are reported with some available results. The present technique is capable of predicting the interface profile near the wall which is important in the reactor subchannel analysis

  8. The Application of Clustering Techniques to Citation Data. Research Reports Series B No. 6.

    Science.gov (United States)

    Arms, William Y.; Arms, Caroline

    This report describes research carried out as part of the Design of Information Systems in the Social Sciences (DISISS) project. Cluster analysis techniques were applied to a machine readable file of bibliographic data in the form of cited journal titles in order to identify groupings which could be used to structure bibliographic files. Practical…

  9. Spectral Unmixing Analysis of Time Series Landsat 8 Images

    Science.gov (United States)

    Zhuo, R.; Xu, L.; Peng, J.; Chen, Y.

    2018-05-01

    Temporal analysis of Landsat 8 images opens up new opportunities in the unmixing procedure. Although spectral analysis of time series Landsat imagery has its own advantages, it has rarely been studied. Nevertheless, using the temporal information can provide improved unmixing performance when compared to independent image analyses. Moreover, different land cover types may demonstrate different temporal patterns, which can aid their discrimination. Therefore, this letter presents time series K-P-Means, a new solution to the problem of unmixing time series Landsat imagery. The proposed approach obtains "purified" pixels in order to achieve optimal unmixing performance. The vertex component analysis (VCA) is used to extract endmembers for endmember initialization. First, nonnegative least squares (NNLS) is used to estimate abundance maps from the endmembers. Then, the estimated endmember is the mean value of the "purified" pixels, i.e., the residual of the mixed pixel after excluding the contribution of all nondominant endmembers. Assembling the two main steps (abundance estimation and endmember update) into an iterative optimization framework generates the complete algorithm. Experiments using both simulated and real Landsat 8 images show that the proposed "joint unmixing" approach provides more accurate endmember and abundance estimates than the "separate unmixing" approach.
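    The abundance-estimation step in this record can be illustrated in the simplest two-endmember case, where the constrained least-squares solution has a closed form (this clamped projection is only a stand-in for the full NNLS solver used with more endmembers; the spectra below are hypothetical):

    ```python
    def unmix_two_endmembers(pixel, e1, e2):
        """Closed-form abundance of endmember e1 in the two-endmember
        linear mixture pixel = a*e1 + (1-a)*e2: project onto the line
        between the endmembers, then clamp the abundance to [0, 1]."""
        diff = [u - v for u, v in zip(e1, e2)]
        num = sum((p - v) * d for p, v, d in zip(pixel, e2, diff))
        den = sum(d * d for d in diff)
        a = num / den
        return min(1.0, max(0.0, a))
    ```

    In the full K-P-Means loop this abundance estimate alternates with the endmember update (the mean of the "purified" pixels) until convergence.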

  10. Dynamical analysis and visualization of tornadoes time series.

    Directory of Open Access Journals (Sweden)

    António M Lopes

    Full Text Available In this paper we analyze the behavior of tornado time series in the U.S. from the perspective of dynamical systems. A tornado is a violently rotating column of air extending from a cumulonimbus cloud down to the ground. Such phenomena reveal features that are well described by power law functions and unveil characteristics found in systems with long-range memory effects. Tornado time series are viewed as the output of a complex system and are interpreted as a manifestation of its dynamics. Tornadoes are modeled as sequences of Dirac impulses with amplitude proportional to the event size. First, a collection of time series spanning 64 years is analyzed in the frequency domain by means of the Fourier transform. The amplitude spectra are approximated by power law functions and their parameters are read as an underlying signature of the system dynamics. Second, the concept of circular time is adopted and the collective behavior of tornadoes is analyzed. Clustering techniques are then adopted to identify and visualize the emerging patterns.

  11. Dynamical analysis and visualization of tornadoes time series.

    Science.gov (United States)

    Lopes, António M; Tenreiro Machado, J A

    2015-01-01

    In this paper we analyze the behavior of tornado time series in the U.S. from the perspective of dynamical systems. A tornado is a violently rotating column of air extending from a cumulonimbus cloud down to the ground. Such phenomena reveal features that are well described by power law functions and unveil characteristics found in systems with long-range memory effects. Tornado time series are viewed as the output of a complex system and are interpreted as a manifestation of its dynamics. Tornadoes are modeled as sequences of Dirac impulses with amplitude proportional to the event size. First, a collection of time series spanning 64 years is analyzed in the frequency domain by means of the Fourier transform. The amplitude spectra are approximated by power law functions and their parameters are read as an underlying signature of the system dynamics. Second, the concept of circular time is adopted and the collective behavior of tornadoes is analyzed. Clustering techniques are then adopted to identify and visualize the emerging patterns.
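    The power-law approximation step can be sketched as ordinary least squares on log-log transformed (frequency, amplitude) points: if the amplitude spectrum follows c·f^(-b), then log(amplitude) is linear in log(frequency). The synthetic spectrum below is an illustrative assumption, not the paper's data.

```python
import math

# Fit amplitude ~= c * f**(-b) by linear regression in log-log space.
def fit_power_law(freqs, amps):
    lx = [math.log(f) for f in freqs]
    ly = [math.log(a) for a in amps]
    n = len(lx)
    mx, my = sum(lx) / n, sum(ly) / n
    slope = (sum((lx[i] - mx) * (ly[i] - my) for i in range(n))
             / sum((lx[i] - mx) ** 2 for i in range(n)))
    intercept = my - slope * mx
    return math.exp(intercept), -slope   # c, b

# Synthetic spectrum following an exact power law with c = 2.0, b = 1.5
freqs = [0.1 * k for k in range(1, 50)]
amps = [2.0 * f ** (-1.5) for f in freqs]
c, b = fit_power_law(freqs, amps)
```

The fitted exponent b is then the "signature" parameter compared across time series.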

  12. Hydrological time series modeling: A comparison between adaptive neuro-fuzzy, neural network and autoregressive techniques

    Science.gov (United States)

    Lohani, A. K.; Kumar, Rakesh; Singh, R. D.

    2012-06-01

    Summary: Time series modeling is necessary for the planning and management of reservoirs. More recently, soft computing techniques have been used in hydrological modeling and forecasting. In this study, the potential of artificial neural networks and neuro-fuzzy systems in monthly reservoir inflow forecasting is examined by developing and comparing monthly reservoir inflow prediction models based on autoregressive (AR) models, artificial neural networks (ANNs) and an adaptive neural-based fuzzy inference system (ANFIS). To account for the effect of monthly periodicity in the flow data, cyclic terms are also included in the ANN and ANFIS models. Working with time series flow data of the Sutlej River at Bhakra Dam, India, several ANN and adaptive neuro-fuzzy models are trained with different input vectors. To evaluate the performance of the selected ANN and ANFIS models, a comparison is made with the AR models. The ANFIS model trained with an input vector including previous inflows and cyclic terms of monthly periodicity shows a significant improvement in forecast accuracy over the ANFIS models trained with input vectors considering only previous inflows. In all cases ANFIS gives more accurate forecasts than the AR and ANN models. The proposed ANFIS model coupled with the cyclic terms is shown to provide a better representation of monthly inflow forecasting for the planning and operation of reservoirs.
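    The idea of augmenting an AR model with cyclic terms can be sketched as least squares on x_t = a·x_{t-1} + b·sin(2πm/12) + c·cos(2πm/12) + d, where m is the month index. The synthetic inflow data and all coefficients below are illustrative assumptions, not the Sutlej River data.

```python
import math
import random

def solve(A, b):
    """Gauss-Jordan elimination for a small dense linear system."""
    n = len(A)
    M = [A[i][:] + [b[i]] for i in range(n)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))   # partial pivot
        M[c], M[p] = M[p], M[c]
        for r in range(n):
            if r != c:
                f = M[r][c] / M[c][c]
                for k in range(c, n + 1):
                    M[r][k] -= f * M[c][k]
    return [M[i][n] / M[i][i] for i in range(n)]

def fit_ar_cyclic(series):
    """Least-squares fit of x_t = a*x_{t-1} + b*sin + c*cos + d."""
    X, y = [], []
    for t in range(1, len(series)):
        ang = 2 * math.pi * (t % 12) / 12
        X.append([series[t - 1], math.sin(ang), math.cos(ang), 1.0])
        y.append(series[t])
    XtX = [[sum(r[p] * r[q] for r in X) for q in range(4)] for p in range(4)]
    Xty = [sum(X[i][p] * y[i] for i in range(len(X))) for p in range(4)]
    return solve(XtX, Xty)

# Synthetic monthly inflows: AR(1) plus an annual cycle plus small noise
random.seed(42)
x = [10.0]
for t in range(1, 400):
    ang = 2 * math.pi * (t % 12) / 12
    x.append(0.6 * x[-1] + 2.0 * math.sin(ang) + 1.5 * math.cos(ang) + 5.0
             + random.uniform(-0.5, 0.5))
coeffs = fit_ar_cyclic(x)   # roughly recovers [0.6, 2.0, 1.5, 5.0]
```

The same sin/cos inputs are what the paper feeds to the ANN and ANFIS models to capture monthly periodicity.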

  13. Analysis of archaeological pieces with nuclear techniques; Analisis de piezas arqueologicas con tecnicas nucleares

    Energy Technology Data Exchange (ETDEWEB)

    Tenorio, D [Instituto Nacional de Investigaciones Nucleares, A.P. 18-1027, 11801 Mexico D.F. (Mexico)

    2002-07-01

    In this work nuclear techniques such as Neutron Activation Analysis, PIXE, X-ray fluorescence analysis, Metallography, Uranium series and Rutherford Backscattering, for use in the analysis of archaeological specimens and materials, are described. Some published works and theses about the analysis of different Mexican and Mesoamerican archaeological sites are also referred to. (Author)

  14. Topological data analysis of financial time series: Landscapes of crashes

    Science.gov (United States)

    Gidea, Marian; Katz, Yuri

    2018-02-01

    We explore the evolution of daily returns of four major US stock market indices during the technology crash of 2000, and the financial crisis of 2007-2009. Our methodology is based on topological data analysis (TDA). We use persistent homology to detect and quantify topological patterns that appear in multidimensional time series. Using a sliding window, we extract time-dependent point cloud data sets, to which we associate a topological space. We detect transient loops that appear in this space, and we measure their persistence. This is encoded in real-valued functions referred to as 'persistence landscapes'. We quantify the temporal changes in persistence landscapes via their Lp-norms. We test this procedure on multidimensional time series generated by various non-linear and non-equilibrium models. We find that, in the vicinity of financial meltdowns, the Lp-norms exhibit strong growth prior to the primary peak, which ascends during a crash. Remarkably, the average spectral density at low frequencies of the time series of Lp-norms of the persistence landscapes demonstrates a strong rising trend for 250 trading days prior to either the dotcom crash on 03/10/2000 or the Lehman bankruptcy on 09/15/2008. Our study suggests that TDA provides a new type of econometric analysis, which complements the standard statistical measures. The method can be used to detect early warning signals of imminent market crashes. We believe that this approach can be used beyond the analysis of financial time series presented here.
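    The persistence-landscape step can be sketched directly from birth-death pairs (b, d): the first landscape function is λ₁(t) = max over pairs of max(0, min(t - b, d - t)), and its Lp-norm is computed by numerical integration. Computing the pairs themselves requires a persistent-homology library, so the pair below is an assumed input, not derived from market data.

```python
# First persistence landscape from birth-death pairs, and its L^p norm.
def landscape_1(t, pairs):
    return max((max(0.0, min(t - b, d - t)) for b, d in pairs), default=0.0)

def lp_norm(pairs, p=1, lo=0.0, hi=3.0, steps=3000):
    """Trapezoidal integration of landscape_1**p, then the p-th root."""
    h = (hi - lo) / steps
    total = 0.0
    for i in range(steps):
        t0, t1 = lo + i * h, lo + (i + 1) * h
        f0 = landscape_1(t0, pairs) ** p
        f1 = landscape_1(t1, pairs) ** p
        total += 0.5 * (f0 + f1) * h
    return total ** (1.0 / p)

# A single loop born at 0 and dying at 2 gives a triangle of height 1
# over [0, 2]; its L1 norm is the triangle area, 1.0.
norm = lp_norm([(0.0, 2.0)], p=1)
```

In the paper's pipeline this norm is computed per sliding window, giving the time series whose growth signals an approaching crash.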

  15. Cluster analysis of activity-time series in motor learning

    DEFF Research Database (Denmark)

    Balslev, Daniela; Nielsen, Finn Å; Futiger, Sally A

    2002-01-01

    Neuroimaging studies of learning focus on brain areas where the activity changes as a function of time. To circumvent the difficult problem of model selection, we used a data-driven analytic tool, cluster analysis, which extracts representative temporal and spatial patterns from the voxel-time series. The optimal number of clusters was chosen using a cross-validated likelihood method, which highlights the clustering pattern that generalizes best over the subjects. Data were acquired with PET at different time points during practice of a visuomotor task. The results from cluster analysis show...

  16. TIME SERIES ANALYSIS ON STOCK MARKET FOR TEXT MINING CORRELATION OF ECONOMY NEWS

    Directory of Open Access Journals (Sweden)

    Sadi Evren SEKER

    2014-01-01

    Full Text Available This paper proposes an information retrieval method for economy news. The effect of economy news is researched at the word level, and stock market values are considered as the ground truth. The correlation between stock market prices and economy news is an already addressed problem for most countries. The most well-known approach is applying text mining approaches to the news and some time series analysis techniques over stock market closing values in order to apply classification or clustering algorithms over the features extracted. This study goes further and tries to ask the question: what are the available time series analysis techniques for stock market closing values and which one is the most suitable? In this study, the news and their dates are collected into a database and text mining is applied over the news; the text mining part has been kept simple, with only the term frequency-inverse document frequency method. For the time series analysis part, we have studied 10 different methods such as random walk, moving average, acceleration, Bollinger band, price rate of change, periodic average, difference, momentum or relative strength index and their variations. In this study we have also explained these techniques in a comparative way and we have applied the methods over Turkish Stock Market closing values for more than a 2-year period. On the other hand, we have applied the term frequency-inverse document frequency method on the economy news of one of the high-circulation newspapers in Turkey.
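    The text-mining side can be sketched with a minimal term frequency-inverse document frequency (TF-IDF) weighting over a toy "news" corpus; whitespace tokenization and the example headlines are illustrative assumptions, not the paper's corpus.

```python
import math

# TF-IDF: term frequency within a document times log of inverse document
# frequency across the corpus; rare terms get higher weights.
def tf_idf(docs):
    n = len(docs)
    vocab = {w for doc in docs for w in doc}
    idf = {w: math.log(n / sum(1 for doc in docs if w in doc)) for w in vocab}
    return [{w: doc.count(w) / len(doc) * idf[w] for w in set(doc)}
            for doc in docs]

docs = [d.split() for d in [
    "lira falls against dollar",
    "central bank raises rates",
    "bank shares rise as rates climb",
]]
weights = tf_idf(docs)
```

A term unique to one headline ("lira") scores higher than one shared across headlines ("bank"); the resulting weight vectors are what get correlated against indicators such as moving averages or the relative strength index.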

  17. Time Series Analysis, Modeling and Applications A Computational Intelligence Perspective

    CERN Document Server

    Chen, Shyi-Ming

    2013-01-01

    Temporal and spatiotemporal data form an inherent fabric of the society as we are faced with streams of data coming from numerous sensors, data feeds, recordings associated with numerous areas of application embracing physical and human-generated phenomena (environmental data, financial markets, Internet activities, etc.). A quest for a thorough analysis, interpretation, modeling and prediction of time series comes with an ongoing challenge for developing models that are both accurate and user-friendly (interpretable). The volume is aimed to exploit the conceptual and algorithmic framework of Computational Intelligence (CI) to form a cohesive and comprehensive environment for building models of time series. The contributions covered in the volume are fully reflective of the wealth of the CI technologies by bringing together ideas, algorithms, and numeric studies, which convincingly demonstrate their relevance, maturity and visible usefulness. It reflects upon the truly remarkable diversity of methodological a...

  18. Recurrence Density Enhanced Complex Networks for Nonlinear Time Series Analysis

    Science.gov (United States)

    Costa, Diego G. De B.; Reis, Barbara M. Da F.; Zou, Yong; Quiles, Marcos G.; Macau, Elbert E. N.

    We introduce a new method, entitled Recurrence Density Enhanced Complex Network (RDE-CN), to properly analyze nonlinear time series. Our method first transforms a recurrence plot into a figure of a reduced number of points, yet preserving the main and fundamental recurrence properties of the original plot. This resulting figure is then reinterpreted as a complex network, which is further characterized by network statistical measures. We illustrate the computational power of the RDE-CN approach with time series from both the logistic map and experimental fluid flows, which show that our method distinguishes different dynamics as well as traditional recurrence analysis. Therefore, the proposed methodology characterizes the recurrence matrix adequately, while using a reduced set of points from the original recurrence plots.
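    The underlying recurrence-plot-to-network idea can be sketched as follows: a recurrence matrix marks pairs of times whose states lie within a threshold ε, and reinterpreting it as an adjacency matrix allows network measures such as node degree. This is a minimal scalar version without embedding or the point-reduction step RDE-CN adds; the ε value is an illustrative assumption.

```python
# Recurrence matrix of a scalar series: R[i][j] = 1 if the states at times
# i and j are closer than eps (diagonal excluded), reinterpreted as the
# adjacency matrix of a network whose node degrees we can then measure.
def recurrence_matrix(series, eps):
    n = len(series)
    return [[1 if i != j and abs(series[i] - series[j]) < eps else 0
             for j in range(n)] for i in range(n)]

def degrees(adj):
    return [sum(row) for row in adj]

# Logistic map in its chaotic regime (r = 4), as used in the paper's tests
x, series = 0.4, []
for _ in range(50):
    series.append(x)
    x = 4.0 * x * (1.0 - x)

R = recurrence_matrix(series, eps=0.05)
deg = degrees(R)
```

Comparing degree (and other network) statistics across regimes is what lets the method distinguish different dynamics.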

  19. Various Techniques to Increase Keratinized Tissue for Implant Supported Overdentures: Retrospective Case Series

    Directory of Open Access Journals (Sweden)

    Ahmed Elkhaweldi

    2015-01-01

    Full Text Available Purpose. The purpose of this retrospective case series is to describe and compare different surgical techniques that can be utilized to augment the keratinized soft tissue around implant-supported overdentures. Materials and Methods. The data set was extracted as deidentified information from the routine treatment of patients at the Ashman Department of Periodontology and Implant Dentistry at New York University College of Dentistry. Eight edentulous patients were selected to be included in this study. Patients were treated for lack of keratinized tissue prior to implant placement, during the second stage surgery, and after delivery of the final prosthesis. Results. All 8 patients in this study were wearing a complete maxillary and/or mandibular denture for at least a year before the time of the surgery. One of the following surgical techniques was utilized to increase the amount of keratinized tissue: apically positioned flap (APF, pedicle graft (PG, connective tissue graft (CTG, or free gingival graft (FGG. Conclusions. The amount of keratinized tissue should be taken into consideration when planning for implant-supported overdentures. The apical repositioning flap is an effective approach to increase the width of keratinized tissue prior to the implant placement.

  20. Sensitivity analysis of hybrid thermoelastic techniques

    Science.gov (United States)

    W.A. Samad; J.M. Considine

    2017-01-01

    Stress functions have been used as a complementary tool to support experimental techniques, such as thermoelastic stress analysis (TSA) and digital image correlation (DIC), in an effort to evaluate the complete and separate full-field stresses of loaded structures. The need for such coupling between experimental data and stress functions is due to the fact that...

  1. Multivariate stochastic analysis for Monthly hydrological time series at Cuyahoga River Basin

    Science.gov (United States)

    zhang, L.

    2011-12-01

    Copulas have become a very powerful statistical and stochastic methodology for multivariate analysis in environmental and water resources engineering. In recent years, the popular one-parameter Archimedean copulas, e.g. the Gumbel-Hougaard copula, Cook-Johnson copula and Frank copula, and the meta-elliptical copulas, e.g. the Gaussian copula and Student-t copula, have been applied in multivariate hydrological analyses, e.g. multivariate rainfall (rainfall intensity, duration and depth), flood (peak discharge, duration and volume), and drought analyses (drought length, mean and minimum SPI values, and drought mean areal extent). Copulas have also been applied in flood frequency analysis at the confluences of river systems by taking into account the dependence among upstream gauge stations rather than by using the hydrological routing technique. In most of the studies above, the annual time series have been considered as stationary signals, in which the time series have been assumed to be independent identically distributed (i.i.d.) random variables. But in reality, hydrological time series, especially daily and monthly hydrological time series, cannot be considered i.i.d. random variables due to the periodicity in the data structure. The stationarity assumption is also under question due to climate change and land use and land cover (LULC) change in the past years. To this end, it is necessary to re-evaluate the classic approach to the study of hydrological time series by relaxing the stationarity assumption through a nonstationary approach. As to the study of the dependence structure of hydrological time series, the assumption of the same type of univariate distribution also needs to be relaxed by adopting copula theory. In this paper, the univariate monthly hydrological time series will be studied through the nonstationary time series analysis approach. The dependence structure of the multivariate monthly hydrological time series will be
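    The key property copulas exploit, separating marginals from dependence, can be sketched for a Gaussian copula: transform each margin to uniforms via ranks, map to normal scores, and estimate the correlation of the scores. The monotone toy data are an illustrative assumption (e.g. discharge and volume); ties are ignored in this sketch.

```python
from statistics import NormalDist

# Normal scores: rank-transform a sample to (0, 1), then apply the
# standard normal inverse CDF (assumes distinct values; ties ignored).
def normal_scores(sample):
    n = len(sample)
    ranks = {v: r for r, v in enumerate(sorted(sample), start=1)}
    return [NormalDist().inv_cdf(ranks[v] / (n + 1)) for v in sample]

def pearson(a, b):
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((a[i] - ma) * (b[i] - mb) for i in range(n))
    va = sum((v - ma) ** 2 for v in a) ** 0.5
    vb = sum((v - mb) ** 2 for v in b) ** 0.5
    return cov / (va * vb)

# Monotonically dependent margins with very different shapes: the Gaussian
# copula correlation depends only on the ranks, not the marginal form.
x = [float(i) for i in range(1, 51)]
y = [v ** 3 for v in x]          # nonlinear but monotone in x
rho = pearson(normal_scores(x), normal_scores(y))
```

Because the dependence parameter is estimated on ranks, the same machinery works whatever univariate distributions the margins follow, which is exactly the relaxation the paragraph calls for.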

  2. Microextraction sample preparation techniques in biomedical analysis.

    Science.gov (United States)

    Szultka, Malgorzata; Pomastowski, Pawel; Railean-Plugaru, Viorica; Buszewski, Boguslaw

    2014-11-01

    Biologically active compounds are found in biological samples at relatively low concentration levels. The sample preparation of target compounds from biological, pharmaceutical, environmental, and food matrices is one of the most time-consuming steps in the analytical procedure, and microextraction techniques are dominant here. Metabolomic studies also require the application of a proper analytical technique for the determination of endogenous metabolites present in a biological matrix at trace concentration levels. Due to the reproducibility of data, precision, relatively low cost of the appropriate analysis, simplicity of the determination, and the possibility of direct combination of those techniques with other methods (on-line and off-line combinations), they have become the most widespread in routine determinations. Additionally, sample pretreatment procedures have to be more selective, cheap, quick, and environmentally friendly. This review summarizes the current achievements and applications of microextraction techniques. The main aim is to deal with the utilization of different types of sorbents for microextraction and emphasize the use of newly synthesized sorbents, as well as to bring together studies concerning a systematic approach to method development. This review is dedicated to the description of microextraction techniques and their application in biomedical analysis. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  3. CRDM motion analysis using machine learning technique

    International Nuclear Information System (INIS)

    Nishimura, Takuya; Nakayama, Hiroyuki; Saitoh, Mayumi; Yaguchi, Seiji

    2017-01-01

    Magnetic jack type Control Rod Drive Mechanisms (CRDMs) for pressurized water reactor (PWR) plants operate control rods in response to electrical signals from a reactor control system. CRDM operability is evaluated by quantifying the armature's closed/opened response time, i.e., the interval between the coil energizing/de-energizing points and the armature closed/opened points. MHI has already developed an automatic CRDM motion analysis and applied it to actual plants. However, CRDM operational data shows wide variation depending on characteristics such as plant condition, plant, etc. In the existing motion analysis, applying a single analysis technique to all plant conditions, plants, etc. raises an issue of analysis accuracy. In this study, MHI investigated motion analysis using machine learning (random forests), which flexibly accommodates CRDM operational data with wide variation and improves analysis accuracy. (author)

  4. Regression and regression analysis time series prediction modeling on climate data of quetta, pakistan

    International Nuclear Information System (INIS)

    Jafri, Y.Z.; Kamal, L.

    2007-01-01

    Various statistical techniques were used on five years of data from 1998-2002 of average humidity, rainfall, and maximum and minimum temperatures, respectively. Relationships for regression analysis time series (RATS) were developed for determining the overall trend of these climate parameters, on the basis of which forecast models can be corrected and modified. We computed the coefficient of determination as a measure of goodness of fit for our polynomial regression analysis time series (PRATS). The correlations for multiple linear regression (MLR) and multiple linear regression analysis time series (MLRATS) were also developed for deciphering the interdependence of weather parameters. Spearman's rank correlation and the Goldfeld-Quandt test were used to check the uniformity or non-uniformity of variances in our fit to polynomial regression (PR). The Breusch-Pagan test was applied to MLR and MLRATS, respectively, which yielded homoscedasticity. We also employed Bartlett's test for homogeneity of variances on five years of data of rainfall and humidity, respectively, which showed that the variances in the rainfall data were not homogeneous while those for humidity were homogeneous. Our results on regression and regression analysis time series show the best fit for prediction modeling of the climatic data of Quetta, Pakistan. (author)
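    The polynomial-trend-plus-R² idea can be sketched with a quadratic fit via the normal equations and the coefficient of determination R² = 1 - SSres/SStot. The exact quadratic toy series is an illustrative assumption, not the Quetta climate data.

```python
def solve3(A, b):
    """Gauss-Jordan elimination for a 3x3 system."""
    M = [A[i][:] + [b[i]] for i in range(3)]
    for c in range(3):
        p = max(range(c, 3), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(3):
            if r != c:
                f = M[r][c] / M[c][c]
                for k in range(c, 4):
                    M[r][k] -= f * M[c][k]
    return [M[i][3] / M[i][i] for i in range(3)]

def polyfit2(t, y):
    """Fit y = c0 + c1*t + c2*t^2 by solving the normal equations."""
    X = [[1.0, ti, ti * ti] for ti in t]
    XtX = [[sum(r[p] * r[q] for r in X) for q in range(3)] for p in range(3)]
    Xty = [sum(X[i][p] * y[i] for i in range(len(y))) for p in range(3)]
    return solve3(XtX, Xty)

def r_squared(y, yhat):
    my = sum(y) / len(y)
    ss_res = sum((yi - yh) ** 2 for yi, yh in zip(y, yhat))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return 1.0 - ss_res / ss_tot

# Toy series generated from an exact quadratic trend
t = list(range(21))
y = [2.0 + 0.5 * ti + 0.1 * ti * ti for ti in t]
c0, c1, c2 = polyfit2(t, y)
r2 = r_squared(y, [c0 + c1 * ti + c2 * ti * ti for ti in t])
```

On data with no noise R² is 1; on real climate series it is the goodness-of-fit measure the paper reports.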

  5. PHOTOGRAMMETRIC TECHNIQUES FOR ROAD SURFACE ANALYSIS

    Directory of Open Access Journals (Sweden)

    V. A. Knyaz

    2016-06-01

    Full Text Available The quality and condition of a road surface is of great importance for the convenience and safety of driving. So investigations of the behaviour of road materials in laboratory conditions and monitoring of existing roads are widely performed for controlling geometric parameters and detecting defects in the road surface. Photogrammetry, as an accurate non-contact measuring method, provides powerful means for solving different tasks in road surface reconstruction and analysis. The range of dimensions concerned in road surface analysis can vary greatly, from tenths of a millimetre to hundreds of metres and more. So a set of techniques is needed to meet all requirements of road parameter estimation. Two photogrammetric techniques for road surface analysis are presented: one for accurate measurement of road pavement and one for road surface reconstruction based on imagery obtained from an unmanned aerial vehicle. The first technique uses a photogrammetric system based on structured light for fast and accurate surface 3D reconstruction, and it allows analysing the characteristics of road texture and monitoring pavement behaviour. The second technique provides a dense 3D road model suitable for estimating road macro parameters.

  6. A time series deformation estimation in the NW Himalayas using SBAS InSAR technique

    Science.gov (United States)

    Kumar, V.; Venkataraman, G.

    2012-12-01

    A time series land deformation study of the north-western Himalayan region is presented. Synthetic aperture radar (SAR) interferometry (InSAR) is an important tool for measuring the land displacement caused by different geological processes [1]. Frequent spatial and temporal decorrelation in the Himalayan region is a strong impediment to precise deformation estimation using the conventional interferometric SAR approach. In such cases, advanced DInSAR approaches such as PSInSAR as well as small baseline subset (SBAS) can be used to estimate earth surface deformation. The SBAS technique [2] is a DInSAR approach which uses twelve or more repeat SAR acquisitions in different combinations of properly chosen data (subsets) for the generation of DInSAR interferograms using a two-pass interferometric approach. Finally it leads to the generation of mean deformation velocity maps and displacement time series. Herein, the SBAS algorithm has been used for time series deformation estimation in the NW Himalayan region. ENVISAT ASAR IS2 swath data from 2003 to 2008 have been used for quantifying slow deformation. The Himalayan region is a very active tectonic belt and active orogeny plays a significant role in the land deformation process [3]. Geomorphology in the region is unique and reacts to climate change adversely, bringing landslides and subsidence. Settlements on the hill slopes are prone to landslides, landslips, rockslides and soil creep. These hazardous features have hampered the overall progress of the region as they obstruct roads and the flow of traffic, break communication, block flowing water in streams and create temporary reservoirs, and also bring down a lot of soil cover and thus add enormous silt and gravel to the streams. It has been observed that average deformation varies from -30.0 mm/year to 10 mm/year in the NW Himalayan region. References [1] Massonnet, D., Feigl, K.L., Rossi, M. and Adragna, F. (1994) Radar interferometry mapping of
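    At its core, SBAS inverts a network of small-baseline interferograms for deformation by least squares. A heavily simplified single-pixel, single-parameter sketch: if each unwrapped interferogram i measures a displacement d_i over a time span dt_i and deformation is linear in time (d_i = v·dt_i), the least-squares velocity is v = Σ(dt_i·d_i)/Σ(dt_i²). Real SBAS solves for a full displacement time series via SVD; the pair spans and velocity below are illustrative assumptions.

```python
# Least-squares mean deformation velocity from interferogram pairs,
# assuming a constant-velocity model d_i = v * dt_i per pixel.
def sbas_velocity(spans_years, displacements_mm):
    num = sum(dt * d for dt, d in zip(spans_years, displacements_mm))
    den = sum(dt * dt for dt in spans_years)
    return num / den

# Small-baseline pairs spanning 0.1-1.0 years; true velocity -30 mm/year
spans = [0.1, 0.2, 0.35, 0.5, 0.7, 1.0]
disp = [-30.0 * dt for dt in spans]
v = sbas_velocity(spans, disp)
```

Repeating this inversion per pixel yields the mean deformation velocity map the abstract describes.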

  7. Interpolation techniques used for data quality control and calculation of technical series: an example of a Central European daily time series

    Czech Academy of Sciences Publication Activity Database

    Štěpánek, P.; Zahradníček, P.; Huth, Radan

    2011-01-01

    Roč. 115, 1-2 (2011), s. 87-98 ISSN 0324-6329 R&D Projects: GA ČR GA205/08/1619 Institutional research plan: CEZ:AV0Z30420517 Keywords : data quality control * filling missing values * interpolation techniques * climatological time series Subject RIV: DG - Athmosphere Sciences, Meteorology Impact factor: 0.364, year: 2011 http://www.met.hu/en/ismeret-tar/kiadvanyok/idojaras/index.php?id=34

  8. Diffraction analysis of customized illumination technique

    Science.gov (United States)

    Lim, Chang-Moon; Kim, Seo-Min; Eom, Tae-Seung; Moon, Seung Chan; Shin, Ki S.

    2004-05-01

    Various enhancement techniques such as alternating PSM, chrome-less phase lithography, double exposure, etc. have been considered as driving forces to push the production k1 factor below 0.35. Among them, layer-specific optimization of the illumination mode, the so-called customized illumination technique, has recently received deep attention from lithographers. A new approach to illumination customization based on diffraction spectrum analysis is suggested in this paper. The illumination pupil is divided into various diffraction domains by comparing the similarity of the confined diffraction spectrum. The singular imaging property of each individual diffraction domain makes it easier to build and understand the customized illumination shape. By comparing the goodness of the image in each domain, it was possible to achieve the customized shape of illumination. With the help of this technique, it was found that a layout change would not give a change in the shape of the customized illumination mode.

  9. Fault tree analysis: concepts and techniques

    International Nuclear Information System (INIS)

    Fussell, J.B.

    1976-01-01

    Concepts and techniques of fault tree analysis have been developed over the past decade, and predictions from this type of analysis are now important considerations in the design of many systems such as aircraft, ships and their electronic systems, missiles, and nuclear reactor systems. Routine, hardware-oriented fault tree construction can be automated; however, considerable effort is needed in this area to bring the methodology to production status. When this status is achieved, the entire analysis of hardware systems will be automated except for the system definition step. Automated analysis is not undesirable; to the contrary, when verified on adequately complex systems, automated analysis could well become a routine analysis. It could also provide an excellent start for a more in-depth fault tree analysis that includes environmental effects, common mode failures, and human errors. Automated analysis is extremely fast and frees the analyst from routine hardware-oriented fault tree construction, as well as eliminating logic errors and errors of oversight in this part of the analysis. Automated analysis thus affords the analyst a powerful tool, allowing his prime efforts to be devoted to unearthing more subtle aspects of the modes of failure of the system.
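    The quantitative core of fault tree analysis can be sketched as recursive gate evaluation: assuming independent basic events, an AND gate's probability is the product of its inputs and an OR gate's is 1 minus the product of the complements. The event probabilities and tree below are illustrative assumptions.

```python
# Minimal fault tree evaluator: a node is either a basic-event probability
# (float) or a (gate, children) pair with gate in {"AND", "OR"}.
def evaluate(node):
    if isinstance(node, float):           # basic event
        return node
    gate, children = node
    probs = [evaluate(c) for c in children]
    if gate == "AND":                     # all inputs must fail
        p = 1.0
        for q in probs:
            p *= q
        return p
    if gate == "OR":                      # any input failing suffices
        p = 1.0
        for q in probs:
            p *= (1.0 - q)
        return 1.0 - p
    raise ValueError("unknown gate: " + gate)

# Top event: (pump fails AND valve fails) OR power loss
tree = ("OR", [("AND", [0.1, 0.2]), 0.05])
p_top = evaluate(tree)
```

Here p_top = 1 - (1 - 0.1·0.2)(1 - 0.05) = 0.069; common-mode failures, which the text mentions, break the independence assumption and need a richer model.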

  10. Multivariate Time Series Forecasting of Crude Palm Oil Price Using Machine Learning Techniques

    Science.gov (United States)

    Kanchymalay, Kasturi; Salim, N.; Sukprasert, Anupong; Krishnan, Ramesh; Raba'ah Hashim, Ummi

    2017-08-01

    The aim of this paper was to study the correlation between the crude palm oil (CPO) price, selected vegetable oil prices (soybean oil, coconut oil, olive oil, rapeseed oil and sunflower oil), the crude oil price and the monthly exchange rate. Comparative analysis was then performed on CPO price forecasting results using machine learning techniques. Monthly CPO prices, selected vegetable oil prices, crude oil prices and monthly exchange rate data from January 1987 to February 2017 were utilized. Preliminary analysis showed a positive and high correlation between the CPO price and the soybean oil price, and also between the CPO price and the crude oil price. Experiments were conducted using multi-layer perceptron, support vector regression and Holt-Winters exponential smoothing techniques. The results were assessed using the criteria of root mean square error (RMSE), mean absolute error (MAE), mean absolute percentage error (MAPE) and direction of accuracy (DA). Among these three techniques, support vector regression (SVR) with the sequential minimal optimization (SMO) algorithm showed relatively better results compared to the multi-layer perceptron and Holt-Winters exponential smoothing methods.
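    The four assessment criteria can be sketched directly; for DA, a common definition (assumed here) counts the steps where the predicted change from the last actual value has the same sign as the actual change. The toy price series is an illustrative assumption, not CPO data.

```python
import math

# RMSE, MAE, MAPE (percent) and direction accuracy for a forecast.
def metrics(actual, predicted):
    n = len(actual)
    rmse = math.sqrt(sum((a - p) ** 2 for a, p in zip(actual, predicted)) / n)
    mae = sum(abs(a - p) for a, p in zip(actual, predicted)) / n
    mape = 100.0 * sum(abs((a - p) / a) for a, p in zip(actual, predicted)) / n
    hits = sum(1 for t in range(1, n)
               if (actual[t] - actual[t - 1]) * (predicted[t] - actual[t - 1]) > 0)
    da = hits / (n - 1)
    return rmse, mae, mape, da

actual = [100.0, 102.0, 101.0, 105.0]
predicted = [100.0, 103.0, 100.0, 104.0]
rmse, mae, mape, da = metrics(actual, predicted)
```

RMSE penalizes large errors more than MAE, MAPE is scale-free, and DA rewards getting the direction of the price move right even when the magnitude is off.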

  11. Applications of neutron activation analysis technique

    International Nuclear Information System (INIS)

    Jonah, S. A.

    2000-07-01

    The technique was developed as far back as 1936 by G. Hevesy and H. Levy for the analysis of Dy using an isotopic source. Approximately 40 elements can be analyzed by the instrumental neutron activation analysis (INAA) technique with neutrons from a nuclear reactor. By applying radiochemical separation, the number of elements that can be analysed may be increased to almost 70. Compared with other analytical methods used in environmental and industrial research, NAA has some unique features. These are multi-element capability, rapidity, reproducibility of results, complementarity to other methods, freedom from analytical blank and independence of the chemical state of elements. There are several types of neutron sources, namely nuclear reactors, accelerator-based and radioisotope-based sources, but nuclear reactors with high fluxes of neutrons from the fission of 235U give the most intense irradiation, and hence the highest available sensitivities for NAA. In this paper, the applications of NAA of socio-economic importance are discussed. The benefits of using NAA and related nuclear techniques for on-line applications in industrial process control are highlighted. A brief description of the NAA set-ups at CERT is given. Finally, NAA is compared with other leading analytical techniques.

  12. Chromatographic Techniques for Rare Earth Elements Analysis

    Science.gov (United States)

    Chen, Beibei; He, Man; Zhang, Huashan; Jiang, Zucheng; Hu, Bin

    2017-04-01

    The present capability of rare earth element (REE) analysis has been achieved by the development of two instrumental techniques. The efficiency of spectroscopic methods was extraordinarily improved for the detection and determination of REE traces in various materials. On the other hand, the determination of REEs very often depends on the preconcentration and separation of REEs, and chromatographic techniques are very powerful tools for the separation of REEs. By coupling with sensitive detectors, many ambitious analytical tasks can be fulfilled. Liquid chromatography is the most widely used technique. Different combinations of stationary phases and mobile phases could be used in ion exchange chromatography, ion chromatography, ion-pair reverse-phase chromatography and some other techniques. The application of gas chromatography is limited because only volatile compounds of REEs can be separated. Thin-layer and paper chromatography are techniques that cannot be directly coupled with suitable detectors, which limit their applications. For special demands, separations can be performed by capillary electrophoresis, which has very high separation efficiency.

  13. Avoid Filling Swiss Cheese with Whipped Cream; Imputation Techniques and Evaluation Procedures for Cross-Country Time Series

    OpenAIRE

    Michael Weber; Michaela Denk

    2011-01-01

    International organizations collect data from national authorities to create multivariate cross-sectional time series for their analyses. As data from countries with not yet well-established statistical systems may be incomplete, the bridging of data gaps is a crucial challenge. This paper investigates data structures and missing data patterns in the cross-sectional time series framework, reviews missing value imputation techniques used for micro data in official statistics, and discusses the...
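    One of the simplest gap-bridging techniques such a review covers is within-series linear interpolation between the nearest observed values; the sketch below is a generic illustration, not a method the paper endorses ("whipped cream" in its metaphor). `None` marks a missing observation; leading and trailing gaps are left untouched.

```python
# Fill internal gaps in a time series by linear interpolation between
# the nearest observed neighbours on each side.
def interpolate_gaps(series):
    out = list(series)
    known = [i for i, v in enumerate(out) if v is not None]
    for a, b in zip(known, known[1:]):
        for i in range(a + 1, b):
            frac = (i - a) / (b - a)
            out[i] = out[a] + frac * (out[b] - out[a])
    return out

filled = interpolate_gaps([1.0, None, None, 4.0, 5.0, None, 7.0])
```

Cross-country settings usually do better than this univariate fill by borrowing strength from correlated indicators or countries, which is the kind of evaluation the paper discusses.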

  14. Time series clustering analysis of health-promoting behavior

    Science.gov (United States)

    Yang, Chi-Ta; Hung, Yu-Shiang; Deng, Guang-Feng

    2013-10-01

    Health promotion must be emphasized to achieve the World Health Organization goal of health for all. Since the global population is aging rapidly, the ComCare elder health-promoting service was developed by the Taiwan Institute for Information Industry in 2011. Based on the Pender health promotion model, the ComCare service offers five categories of health-promoting functions to address the everyday needs of seniors: nutrition management, social support, exercise management, health responsibility and stress management. To assess the overall ComCare service and to improve understanding of the health-promoting behavior of elders, this study analyzed health-promoting behavioral data automatically collected by the ComCare monitoring system. In the 30638 session records collected for 249 elders from January 2012 to March 2013, behavior patterns were identified by a fuzzy c-means time series clustering algorithm combined with autocorrelation-based representation schemes. The analysis showed that the time series data for elder health-promoting behavior can be classified into four different clusters. Each type reveals different health-promoting needs, frequencies, function numbers and behaviors. The results of the data analysis can assist policymakers, health-care providers, and experts in medicine, public health, nursing and psychology, and have been provided to the Taiwan National Health Insurance Administration to assess elder health-promoting behavior.
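    Fuzzy c-means assigns each observation a graded membership in every cluster and alternately updates memberships and centers. A minimal one-dimensional sketch (fuzzifier m = 2, two clusters, extreme-point initialization); the scalar "behaviour feature" data are an illustrative assumption, and the full method would cluster autocorrelation-based representations of whole series.

```python
# Fuzzy c-means on scalar data: membership u[i][k] of point i in cluster k
# is inversely weighted by relative distances; centers are u^m-weighted means.
def fuzzy_c_means(data, c=2, m=2.0, iters=100):
    centers = [min(data), max(data)]        # simple init (works for c = 2)
    u = []
    for _ in range(iters):
        u = []
        for x in data:
            d = [abs(x - ck) + 1e-12 for ck in centers]   # avoid div by zero
            u.append([1.0 / sum((d[k] / d[j]) ** (2 / (m - 1))
                                for j in range(c))
                      for k in range(c)])
        centers = [sum(u[i][k] ** m * data[i] for i in range(len(data))) /
                   sum(u[i][k] ** m for i in range(len(data)))
                   for k in range(c)]
    return centers, u

# Two well-separated groups of behaviour scores
data = [1.0, 1.2, 0.8, 9.8, 10.0, 10.2]
centers, memberships = fuzzy_c_means(data)
```

Unlike hard k-means, points between clusters get fractional memberships, which suits gradual differences in health-promoting behaviour profiles.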

  15. Time series analysis of gold production in Malaysia

    Science.gov (United States)

    Muda, Nora; Hoon, Lee Yuen

    2012-05-01

Gold is a soft, malleable, bright yellow metallic element that is unaffected by air and most reagents. It is highly valued as an asset or investment commodity and is extensively used in jewellery, industrial applications, dentistry and medicine. In Malaysia, gold mining is limited to a few areas, namely Pahang, Kelantan, Terengganu, Johor and Sarawak. The main purpose of this case study is to obtain a suitable model for the production of gold in Malaysia. The model can also be used to predict Malaysia's gold production in the future. The Box-Jenkins time series method was used, with the following steps: identification, estimation, diagnostic checking and forecasting. In addition, the accuracy of prediction was tested using the mean absolute percentage error (MAPE). From the analysis, the ARIMA(3,1,1) model was found to be the best-fitting model, with a MAPE of 3.704%, indicating that the prediction is very accurate. Hence, this model can be used for forecasting. This study is expected to help the private and public sectors understand the gold production scenario and plan gold mining activities in Malaysia.
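The MAPE criterion quoted above is a simple quantity; a minimal sketch of its calculation (function name hypothetical):

```python
def mape(actual, forecast):
    """Mean absolute percentage error, in percent."""
    return 100.0 * sum(abs((a - f) / a)
                       for a, f in zip(actual, forecast)) / len(actual)
```

On a common rule of thumb, a MAPE under 10% is read as highly accurate forecasting, consistent with the 3.704% reported for the ARIMA(3,1,1) fit.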

  16. Artificial Intelligence techniques for big data analysis

    OpenAIRE

    Aditya Khatri

    2017-01-01

During my stay in Salamanca (Spain), I was fortunate enough to participate in the BISITE Research Group of the University of Salamanca. The University of Salamanca is the oldest university in Spain and in 2018 it celebrates its 8th centenary. As a computer science researcher, I participated in one of the many international projects that the research group has ongoing, especially in big data analysis using Artificial Intelligence (AI) techniques. AI is one of BISITE's main lines of rese...

  17. Reliability analysis of large scaled structures by optimization technique

    International Nuclear Information System (INIS)

    Ishikawa, N.; Mihara, T.; Iizuka, M.

    1987-01-01

This paper presents a reliability analysis based on an optimization technique using the PNET (Probabilistic Network Evaluation Technique) method for highly redundant structures having a large number of collapse modes. This approach makes the best use of the optimization technique, into which the idea of the PNET method is incorporated. The analytical process involves minimizing the safety index of the representative mode, subject to satisfaction of the mechanism condition and of positive external work. The procedure entails the sequential solution of a series of NLP (Nonlinear Programming) problems, where the correlation condition of the PNET method pertaining to the representative mode is taken as an additional constraint in the next analysis. Upon succeeding iterations, the final analysis is reached when the collapse probability of the subsequent mode is negligible compared with the value at the first mode. The approximate collapse probability of the structure is defined as the sum of the collapse probabilities of the representative modes classified by the extent of correlation. To confirm the validity of the proposed method, the conventional Monte Carlo simulation is also revised by using collapse load analysis. Finally, two fairly large structures were analyzed to illustrate the scope and application of the approach. (orig./HP)

  18. Applications Of Binary Image Analysis Techniques

    Science.gov (United States)

    Tropf, H.; Enderle, E.; Kammerer, H. P.

    1983-10-01

After discussing the conditions under which binary image analysis techniques can be used, three new applications of the fast binary image analysis system S.A.M. (Sensorsystem for Automation and Measurement) are reported: (1) The human view direction is measured at TV frame rate while the subject's head remains freely movable. (2) Industrial parts hanging on a moving conveyor are classified prior to spray painting by robot. (3) In automotive wheel assembly, the eccentricity of the wheel is minimized by turning the tyre relative to the rim in order to balance the eccentricity of the components.

  19. Infusing Reliability Techniques into Software Safety Analysis

    Science.gov (United States)

    Shi, Ying

    2015-01-01

Software safety analysis for a large software-intensive system is always a challenge. Software safety practitioners need to ensure that software-related hazards are completely identified, controlled, and tracked. This paper discusses in detail how to incorporate traditional reliability techniques into the entire software safety analysis process. In addition, this paper addresses how information can be effectively shared among the various practitioners involved in the software safety analyses. The author has successfully applied the approach to several aerospace applications. Examples are provided to illustrate the key steps of the proposed approach.

  20. Predicting the Market Potential Using Time Series Analysis

    Directory of Open Access Journals (Sweden)

    Halmet Bradosti

    2015-12-01

Full Text Available The aim of this analysis is to forecast a mini-market's sales volume for the period of twelve months from August 2015 to August 2016. The study is based on the monthly sales in Iraqi Dinar of a private local mini-market from April 2014 to July 2015. As revealed in the graph, and assuming the stagnant economic conditions continue, the trend of future sales is downward. Based on the time series analysis, the business may continue to operate and generate small revenues until August 2016. However, due to the low sales volume, low profit margin and operating expenses, the revenues may not be adequate to produce positive net income, and the business may not be able to operate afterward. The principal issue arising from this analysis is that forecasting sales in the region is difficult, because the business cycle is highly dynamic and volatile due to systematic risks and an unforeseeable future.
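A trend extrapolation of the kind described can be sketched with an ordinary least-squares line fitted to the monthly series; this is a simplified stand-in for the full time series analysis, with hypothetical names and data:

```python
def linear_trend_forecast(sales, horizon):
    """Fit y = a + b*t by ordinary least squares and extrapolate
    `horizon` periods beyond the observed data."""
    n = len(sales)
    mt = (n - 1) / 2.0                      # mean of t = 0..n-1
    my = sum(sales) / n
    b = (sum((t - mt) * (y - my) for t, y in enumerate(sales))
         / sum((t - mt) ** 2 for t in range(n)))
    a = my - b * mt
    return [a + b * (n - 1 + h) for h in range(1, horizon + 1)]
```

A negative fitted slope yields a declining forecast path, matching the downward trend the study reports.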

  1. The development of human behavior analysis techniques

    International Nuclear Information System (INIS)

    Lee, Jung Woon; Lee, Yong Hee; Park, Geun Ok; Cheon, Se Woo; Suh, Sang Moon; Oh, In Suk; Lee, Hyun Chul; Park, Jae Chang.

    1997-07-01

In this project, a study of man-machine interaction in Korean nuclear power plants, we developed SACOM (Simulation Analyzer with a Cognitive Operator Model), a tool for the assessment of task performance in control rooms using software simulation, and also developed human error analysis and application techniques. SACOM was developed to assess an operator's physical workload, workload in information navigation at VDU workstations, and cognitive workload in procedural tasks. We developed a trip analysis system, including a procedure based on man-machine interaction analysis and a classification system. We analyzed a total of 277 trips that occurred from 1978 to 1994 to produce trip summary information and, for 79 cases induced by human errors, time-lined the man-machine interactions. INSTEC, a database system for our analysis results, was developed. MARSTEC, a multimedia authoring and representation system for trip information, was also developed, and techniques for human error detection in human factors experiments were established. (author). 121 refs., 38 tabs., 52 figs

  2. The development of human behavior analysis techniques

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jung Woon; Lee, Yong Hee; Park, Geun Ok; Cheon, Se Woo; Suh, Sang Moon; Oh, In Suk; Lee, Hyun Chul; Park, Jae Chang

    1997-07-01

In this project, a study of man-machine interaction in Korean nuclear power plants, we developed SACOM (Simulation Analyzer with a Cognitive Operator Model), a tool for the assessment of task performance in control rooms using software simulation, and also developed human error analysis and application techniques. SACOM was developed to assess an operator's physical workload, workload in information navigation at VDU workstations, and cognitive workload in procedural tasks. We developed a trip analysis system, including a procedure based on man-machine interaction analysis and a classification system. We analyzed a total of 277 trips that occurred from 1978 to 1994 to produce trip summary information and, for 79 cases induced by human errors, time-lined the man-machine interactions. INSTEC, a database system for our analysis results, was developed. MARSTEC, a multimedia authoring and representation system for trip information, was also developed, and techniques for human error detection in human factors experiments were established. (author). 121 refs., 38 tabs., 52 figs.

  3. Industrial electricity demand for Turkey: A structural time series analysis

    International Nuclear Information System (INIS)

    Dilaver, Zafer; Hunt, Lester C.

    2011-01-01

This research investigates the relationship between Turkish industrial electricity consumption, industrial value added and electricity prices in order to forecast future Turkish industrial electricity demand. To achieve this, an industrial electricity demand function for Turkey is estimated by applying the structural time series technique to annual data over the period 1960 to 2008. In addition to identifying the size and significance of the price and industrial value added (output) elasticities, this technique also uncovers the electricity Underlying Energy Demand Trend (UEDT) for the Turkish industrial sector and is, as far as is known, the first attempt to do so. The results suggest that output, real electricity prices and the UEDT all have an important role to play in driving Turkish industrial electricity demand. Consequently, they should all be incorporated when modelling Turkish industrial electricity demand, and the estimated UEDT should arguably be considered in future energy policy decisions concerning the Turkish electricity industry. The output and price elasticities are estimated to be 0.15 and -0.16 respectively, with a UEDT that is increasing, but at a decreasing rate. Based on the estimated equation and different forecast assumptions, Turkish industrial electricity demand is predicted to be somewhere between 97 and 148 TWh by 2020. -- Research Highlights: → Estimated output and price elasticities of 0.15 and -0.16 respectively. → Estimated upward-sloping UEDT (i.e. energy using) but at a decreasing rate. → Predicted Turkish industrial electricity demand between 97 and 148 TWh in 2020.
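Because the demand function is log-linear, the estimated elasticities translate directly into approximate demand growth. A back-of-the-envelope sketch using the reported elasticities (the trend argument stands in for the UEDT and is a hypothetical placeholder, not the paper's estimated trend):

```python
import math

def demand_growth(output_growth_pct, price_growth_pct,
                  e_y=0.15, e_p=-0.16, trend=0.0):
    """Percent change in demand implied by a constant-elasticity demand
    function: d(ln D) = e_y * d(ln Y) + e_p * d(ln P) + trend."""
    dln_d = (e_y * math.log1p(output_growth_pct / 100.0)
             + e_p * math.log1p(price_growth_pct / 100.0)
             + trend)
    return 100.0 * math.expm1(dln_d)
```

With these elasticities, a 10% rise in industrial output raises demand by roughly 1.4%, while a 10% price rise lowers it by about 1.5%.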

  4. Adventures in Modern Time Series Analysis: From the Sun to the Crab Nebula and Beyond

    Science.gov (United States)

    Scargle, Jeffrey

    2014-01-01

With the generation of long, precise, and finely sampled time series, the Age of Digital Astronomy is uncovering and elucidating energetic dynamical processes throughout the Universe. Exploiting these opportunities requires effective data analysis techniques that rapidly and automatically implement advanced concepts. The Time Series Explorer, under development in collaboration with Tom Loredo, provides tools ranging from simple but optimal histograms to time- and frequency-domain analysis for arbitrary data modes with any time sampling. Much of this development owes its existence to Joe Bredekamp and the encouragement he provided over several decades. Sample results for solar chromospheric activity, gamma-ray activity in the Crab Nebula, active galactic nuclei and gamma-ray bursts will be displayed.

  5. Social network analysis of character interaction in the Stargate and Star Trek television series

    Science.gov (United States)

    Tan, Melody Shi Ai; Ujum, Ephrance Abu; Ratnavelu, Kuru

This paper undertakes a social network analysis of two science fiction television series, Stargate and Star Trek. Television series convey stories in the form of character interaction, which can be represented as "character networks". We connect each pair of characters that exchanged spoken dialogue in any given scene demarcated in the television series transcripts. These networks are then used to characterize the overall structure and topology of each series. We find that the character networks of both series have similar structure and topology to that found in previous work on mythological and fictional networks. The character networks exhibit small-world effects, but we found no significant support for a power-law degree distribution. Since the progression of an episode depends to a large extent on the interaction between its characters, the underlying network structure tells us something about the complexity of that episode's storyline. We assessed this complexity using techniques from spectral graph theory. We found that the episode networks are structured either as (1) closed networks, (2) networks containing bottlenecks that connect otherwise disconnected clusters, or (3) a mixture of both.
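The small-world effect reported above combines high local clustering with short path lengths; the clustering half of that diagnostic is straightforward to compute from a character network. A minimal sketch (not the authors' code; the graphs are hypothetical):

```python
def clustering_coefficient(adj):
    """Average local clustering coefficient of an undirected graph given as
    {node: set_of_neighbors}; nodes with degree < 2 contribute zero."""
    total = 0.0
    for node, nbrs in adj.items():
        k = len(nbrs)
        if k < 2:
            continue
        # count edges among the node's neighbors
        links = sum(1 for u in nbrs for v in nbrs if u < v and v in adj[u])
        total += 2.0 * links / (k * (k - 1))
    return total / len(adj)
```

A fully connected clique of characters scores 1, while a chain of characters who never share scenes with each other's partners scores 0.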

  6. Land Subsidence Monitoring by InSAR Time Series Technique Derived From ALOS-2 PALSAR-2 over Surabaya City, Indonesia

    Science.gov (United States)

    Aditiya, A.; Takeuchi, W.; Aoki, Y.

    2017-12-01

Surabaya is the second largest city in Indonesia and the capital of East Java Province, with rapid population growth and industrialization. Urbanization in a big city can bring potential disasters, either natural or anthropogenic, such as land subsidence and flooding. The pattern of land subsidence needs to be mapped for the purposes of planning and structuring the city, as well as for taking appropriate policy measures to anticipate and mitigate its impact. This research used the interferometric Synthetic Aperture Radar (InSAR) Small Baseline Subset (SBAS) technique and applied time series analysis to investigate the land subsidence that occurred. The technique includes focusing the SAR data, incorporating the precise orbit, generating the interferogram, and phase unwrapping using the SNAPHU algorithm. The results show that land subsidence was detected during 2014-2017 over the Surabaya city area using ALOS-2/PALSAR-2 image data. The subsidence is observed in several areas of Surabaya, in particular the northern part, reaching up to ∼2 cm/year. The fastest subsidence occurs in highly populated areas that are vulnerable to flooding and sea level rise. In urban areas we found a correlation between land subsidence and residential or industrial land use. We conclude that land subsidence is mainly caused by ground water consumption for industrial and residential use.

  7. A new analysis technique for microsamples

    International Nuclear Information System (INIS)

    Boyer, R.; Journoux, J.P.; Duval, C.

    1989-01-01

For many decades, isotopic analysis of uranium or plutonium has been performed by mass spectrometry. The most recent analytical techniques, using the counting method or a plasma torch combined with a mass spectrometer (ICP-MS), have yet to reach a greater degree of precision than the older methods in this field. The two means of ionization for isotopic analysis - electronic bombardment of atoms or molecules (gas ion source) and thermal effect (thermoionic source) - are compared, revealing some inconsistency between the quantity of sample necessary for analysis and the luminosity. In fact, the quantity of sample necessary for the gas source mass spectrometer is 10 to 20 times greater than that for the thermoionization spectrometer, while the sample consumption is 10^5 to 10^6 times greater. This proves that almost the entire sample is not needed for the measurement itself; it is only required by the introduction system of the gas spectrometer. The new analysis technique, referred to as ''Microfluorination'', corrects this anomaly and exploits the advantages of the electron bombardment method of ionization

  8. Flash Infrared Thermography Contrast Data Analysis Technique

    Science.gov (United States)

    Koshti, Ajay

    2014-01-01

    This paper provides information on an IR Contrast technique that involves extracting normalized contrast versus time evolutions from the flash thermography inspection infrared video data. The analysis calculates thermal measurement features from the contrast evolution. In addition, simulation of the contrast evolution is achieved through calibration on measured contrast evolutions from many flat-bottom holes in the subject material. The measurement features and the contrast simulation are used to evaluate flash thermography data in order to characterize delamination-like anomalies. The thermal measurement features relate to the anomaly characteristics. The contrast evolution simulation is matched to the measured contrast evolution over an anomaly to provide an assessment of the anomaly depth and width which correspond to the depth and diameter of the equivalent flat-bottom hole (EFBH) similar to that used as input to the simulation. A similar analysis, in terms of diameter and depth of an equivalent uniform gap (EUG) providing a best match with the measured contrast evolution, is also provided. An edge detection technique called the half-max is used to measure width and length of the anomaly. Results of the half-max width and the EFBH/EUG diameter are compared to evaluate the anomaly. The information provided here is geared towards explaining the IR Contrast technique. Results from a limited amount of validation data on reinforced carbon-carbon (RCC) hardware are included in this paper.

  9. Time series analysis of brain regional volume by MR image

    International Nuclear Information System (INIS)

    Tanaka, Mika; Tarusawa, Ayaka; Nihei, Mitsuyo; Fukami, Tadanori; Yuasa, Tetsuya; Wu, Jin; Ishiwata, Kiichi; Ishii, Kenji

    2010-01-01

The present study proposes a methodology for time series analysis of the volumes of the frontal, parietal, temporal and occipital lobes and the cerebellum, because such volumetric reports over the course of an individual's aging have scarcely been presented. The subjects analyzed were brain images of 2 healthy males and 18 females (average age 69.0 y), for which T1-weighted 3D SPGR (spoiled gradient recalled in the steady state) acquisitions with a GE SIGNA EXCITE HD 1.5T machine were conducted 4 times over a period of 42-50 months. The image size was 256 x 256 x (86-124) voxels with a digitization level of 16 bits. As the template for the regions, the standard gray matter atlas (icbn452 atlas probability gray) and its labeled version (icbn.Labels), provided by the UCLA Laboratory of Neuro Imaging, were used for individual standardization. Segmentation, normalization and coregistration were performed with the MR imaging software SPM8 (Statistical Parametric Mapping 8). Regional volumes were calculated as the percentage of their voxel count relative to the whole brain. It was found that the regional volumes decreased with aging in all of the above lobes and the cerebellum, at average rates of -0.11, -0.07, -0.04, -0.02, and -0.03 percent per year, respectively. The calculation of regional volumes, which has hitherto been operated manually, can be conducted automatically for an individual brain using the standard atlases above. (T.T.)

  10. A non linear analysis of human gait time series based on multifractal analysis and cross correlations

    International Nuclear Information System (INIS)

    Munoz-Diosdado, A

    2005-01-01

We analyzed databases with gait time series of adults and persons with Parkinson's disease, Huntington's disease and amyotrophic lateral sclerosis (ALS). We obtained staircase graphs of accumulated events that can be bounded by a straight line whose slope can be used to distinguish between gait time series from healthy and ill persons. The global Hurst exponents of these series do not show clear tendencies; we contend that this is because some gait time series have monofractal behavior and others have multifractal behavior, so they cannot be characterized by a single Hurst exponent. We calculated the multifractal spectra and their widths and found that the spectra of healthy young persons are almost monofractal. The spectra of ill persons are wider than the spectra of healthy persons. In contrast to interbeat time series, where pathology implies loss of multifractality, in gait time series multifractal behavior emerges with the pathology. Data were collected from healthy and ill subjects as they walked in a roughly circular path with sensors on both feet, giving one time series for the left foot and another for the right foot. First, we analyzed these time series separately; then we compared the results, both by direct comparison and by cross correlation analysis. We sought differences between the two time series that could be used as indicators of equilibrium problems

  11. A non linear analysis of human gait time series based on multifractal analysis and cross correlations

    Energy Technology Data Exchange (ETDEWEB)

    Munoz-Diosdado, A [Department of Mathematics, Unidad Profesional Interdisciplinaria de Biotecnologia, Instituto Politecnico Nacional, Av. Acueducto s/n, 07340, Mexico City (Mexico)

    2005-01-01

We analyzed databases with gait time series of adults and persons with Parkinson's disease, Huntington's disease and amyotrophic lateral sclerosis (ALS). We obtained staircase graphs of accumulated events that can be bounded by a straight line whose slope can be used to distinguish between gait time series from healthy and ill persons. The global Hurst exponents of these series do not show clear tendencies; we contend that this is because some gait time series have monofractal behavior and others have multifractal behavior, so they cannot be characterized by a single Hurst exponent. We calculated the multifractal spectra and their widths and found that the spectra of healthy young persons are almost monofractal. The spectra of ill persons are wider than the spectra of healthy persons. In contrast to interbeat time series, where pathology implies loss of multifractality, in gait time series multifractal behavior emerges with the pathology. Data were collected from healthy and ill subjects as they walked in a roughly circular path with sensors on both feet, giving one time series for the left foot and another for the right foot. First, we analyzed these time series separately; then we compared the results, both by direct comparison and by cross correlation analysis. We sought differences between the two time series that could be used as indicators of equilibrium problems.

  12. The modified Pirogoff's amputation in treating diabetic foot infections: surgical technique and case series

    Directory of Open Access Journals (Sweden)

    Aziz Nather

    2014-04-01

    Full Text Available Background: This paper describes the surgical technique of a modified Pirogoff's amputation performed by the senior author and reports the results of this operation in a single surgeon case series for patients with diabetic foot infections. Methods: Six patients with diabetic foot infections were operated on by the National University Hospital (NUH diabetic foot team in Singapore between November 2011 and January 2012. All patients underwent a modified Pirogoff's amputation for diabetic foot infections. Inclusion criteria included the presence of a palpable posterior tibial pulse, ankle brachial index (ABI of more than 0.7, and distal infections not extending proximally beyond the midfoot level. Clinical parameters such as presence of pulses and ABI were recorded. Preoperative blood tests performed included a glycated hemoglobin level, hemoglobin, total white blood cell count, C-reactive protein, erythrocyte sedimentation rate, albumin, and creatinine levels. All patients were subjected to 14 sessions of hyperbaric oxygen therapy postoperatively and were followed up for a minimum of 10 months. Results: All six patients had good wound healing. Tibio-calcaneal arthrodesis of the stump was achieved in all cases by 6 months postoperatively. All patients were able to walk with the prosthesis. Conclusions: The modified Pirogoff's amputation has been found to show good results in carefully selected patients with diabetic foot infections. The selection criteria included a palpable posterior tibial pulse, distal infections not extending proximally beyond the midfoot level, ABI of more than 0.7, hemoglobin level of more than 10 g/dL, and serum albumin level of more than 30 g/L.

  13. Reliability analysis techniques for the design engineer

    International Nuclear Information System (INIS)

    Corran, E.R.; Witt, H.H.

    1982-01-01

    This paper describes a fault tree analysis package that eliminates most of the housekeeping tasks involved in proceeding from the initial construction of a fault tree to the final stage of presenting a reliability analysis in a safety report. It is suitable for designers with relatively little training in reliability analysis and computer operation. Users can rapidly investigate the reliability implications of various options at the design stage and evolve a system which meets specified reliability objectives. Later independent review is thus unlikely to reveal major shortcomings necessitating modification and project delays. The package operates interactively, allowing the user to concentrate on the creative task of developing the system fault tree, which may be modified and displayed graphically. For preliminary analysis, system data can be derived automatically from a generic data bank. As the analysis proceeds, improved estimates of critical failure rates and test and maintenance schedules can be inserted. The technique is applied to the reliability analysis of the recently upgraded HIFAR Containment Isolation System. (author)

  14. Centrality measures in temporal networks with time series analysis

    Science.gov (United States)

    Huang, Qiangjuan; Zhao, Chengli; Zhang, Xue; Wang, Xiaojie; Yi, Dongyun

    2017-05-01

The identification of important nodes in networks has wide application in different fields. However, current research is mostly based on static or aggregated networks. Recently, increasing attention to networks with time-varying structure has promoted the study of node centrality in temporal networks. In this paper, we define a supra-evolution matrix to describe the temporal network structure. Using time series analysis, the relationships between different time layers can be learned automatically. Based on the special form of the supra-evolution matrix, the eigenvector centrality calculation is reduced, through iteration, to the computation of eigenvectors of several low-dimensional matrices, which effectively lowers the computational complexity. Experiments are carried out on two real-world temporal networks, the Enron email communication network and the DBLP co-authorship network, and the results show that our method is more efficient at discovering important nodes than the common aggregation method.
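For a single static layer, the eigenvector centrality underlying this method is the dominant eigenvector of the adjacency matrix, obtainable by power iteration. A minimal sketch of that building block only (the supra-evolution matrix construction itself is not reproduced here):

```python
def eigenvector_centrality(adj, iters=500, tol=1e-12):
    """Eigenvector centrality by power iteration on an adjacency matrix
    (list of lists).  Iterating on (A + I) keeps the dominant eigenvector
    of A while avoiding oscillation on bipartite graphs; scores are
    normalized so the largest equals 1."""
    n = len(adj)
    x = [1.0] * n
    for _ in range(iters):
        nxt = [x[i] + sum(adj[i][j] * x[j] for j in range(n))
               for i in range(n)]
        norm = max(nxt)
        nxt = [v / norm for v in nxt]
        if max(abs(a - b) for a, b in zip(nxt, x)) < tol:
            return nxt
        x = nxt
    return x
```

On a star graph the hub scores 1 and each leaf scores 1/sqrt(3) ≈ 0.577, reflecting the hub's dominance.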

  15. Adult Craniopharyngioma: Case Series, Systematic Review, and Meta-Analysis.

    Science.gov (United States)

    Dandurand, Charlotte; Sepehry, Amir Ali; Asadi Lari, Mohammad Hossein; Akagami, Ryojo; Gooderham, Peter

    2017-12-18

The optimal therapeutic approach for adult craniopharyngioma remains controversial. Some advocate gross total resection (GTR), while others advocate subtotal resection followed by adjuvant radiotherapy (STR + XRT). We conducted a systematic review and meta-analysis assessing the rate of recurrence over a 3-yr follow-up in adult craniopharyngioma, stratified by extent of resection and use of adjuvant radiotherapy. MEDLINE (1946-July 1, 2016) and EMBASE (1980-June 30, 2016) were systematically reviewed. From 1975 to 2013, 33 patients were treated with initial surgical resection for adult-onset craniopharyngioma at our center and were reviewed for inclusion in this study. Data from 22 patients were available for inclusion as a case series in the systematic review. Eligible studies (n = 21) were identified from the literature in addition to the case series of our institutional experience. Three groups were available for analysis: GTR, STR + XRT, and STR. The rates of recurrence were 17%, 27%, and 45%, respectively. The risk of developing recurrence was significantly lower for GTR vs STR (odds ratio [OR]: 0.24, 95% confidence interval [CI]: 0.15-0.38) and for STR + XRT vs STR (OR: 0.20, 95% CI: 0.10-0.41). The difference in risk of recurrence for GTR vs STR + XRT did not reach significance (OR: 0.63, 95% CI: 0.33-1.24, P = .18). This is the first and largest systematic review focusing on the rate of recurrence in adult craniopharyngioma. Although the rates of recurrence favor GTR, the difference in risk of recurrence did not reach significance. This study provides guidance to clinicians and directions for future research, with the need to stratify outcomes per treatment modality. Copyright © 2017 by the Congress of Neurological Surgeons
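The odds ratios and confidence intervals above follow the standard 2x2-table calculation with a normal approximation on the log-odds scale. A sketch with illustrative counts (not data from the studies reviewed):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and 95% CI from a 2x2 table, where a/b are events and
    non-events in group 1 and c/d in group 2 (Woolf log-OR method)."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1.0 / a + 1.0 / b + 1.0 / c + 1.0 / d)
    lower = math.exp(math.log(or_) - z * se)
    upper = math.exp(math.log(or_) + z * se)
    return or_, lower, upper
```

An interval excluding 1 marks a significant difference in recurrence odds, as for GTR vs STR above; an interval straddling 1, as for GTR vs STR + XRT, does not reach significance.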

  16. Interferogram analysis using the Abel inversion technique

    International Nuclear Information System (INIS)

    Yusof Munajat; Mohamad Kadim Suaidi

    2000-01-01

A high-speed, high-resolution optical detection system was used to capture images of acoustic wave propagation. The frozen image, in the form of an interferogram, was analysed to calculate the transient pressure profile of the acoustic waves. The interferogram analysis was based on the fringe shift and the application of the Abel inversion technique. The approach was simplified by using the MathCAD program as the programming tool, which is nonetheless powerful enough for the required calculation, plotting and file transfer. (Author)

  17. Employ Simulation Techniques. Second Edition. Module C-5 of Category C--Instructional Execution. Professional Teacher Education Module Series.

    Science.gov (United States)

    Ohio State Univ., Columbus. National Center for Research in Vocational Education.

    One of a series of performance-based teacher education learning packages focusing upon specific professional competencies of vocational teachers, this learning module deals with employing simulation techniques. It consists of an introduction and four learning experiences. Covered in the first learning experience are various types of simulation…

  18. Financial time series analysis based on effective phase transfer entropy

    Science.gov (United States)

    Yang, Pengbo; Shang, Pengjian; Lin, Aijing

    2017-02-01

Transfer entropy is a powerful technique that quantifies the impact of one dynamic system on another. In this paper, we propose the effective phase transfer entropy method, based on the transfer entropy method. We use simulated data to test the performance of this method, and the experimental results confirm that the proposed approach is capable of detecting information transfer between systems. We also explore the relationship between effective phase transfer entropy and variables such as data size, coupling strength and noise. The effective phase transfer entropy is positively correlated with the data size and the coupling strength. Even in the presence of a large amount of noise, it can detect the information transfer between systems, and it is very robust to noise. Moreover, this measure estimates the information flow between systems more accurately than phase transfer entropy. To reflect the application of this method in practice, we apply it to financial time series and gain new insight into the interactions between systems. It is demonstrated that the effective phase transfer entropy can be used to detect some economic fluctuations in the financial market. To summarize, the effective phase transfer entropy method is a very efficient tool for estimating the information flow between systems.

  19. Modeling activity patterns of wildlife using time-series analysis.

    Science.gov (United States)

    Zhang, Jindong; Hull, Vanessa; Ouyang, Zhiyun; He, Liang; Connor, Thomas; Yang, Hongbo; Huang, Jinyan; Zhou, Shiqiang; Zhang, Zejun; Zhou, Caiquan; Zhang, Hemin; Liu, Jianguo

    2017-04-01

    The study of wildlife activity patterns is an effective approach to understanding fundamental ecological and evolutionary processes. However, traditional statistical approaches used to conduct quantitative analysis have thus far had limited success in revealing underlying mechanisms driving activity patterns. Here, we combine wavelet analysis, a type of frequency-based time-series analysis, with high-resolution activity data from accelerometers embedded in GPS collars to explore the effects of internal states (e.g., pregnancy) and external factors (e.g., seasonal dynamics of resources and weather) on activity patterns of the endangered giant panda (Ailuropoda melanoleuca). Giant pandas exhibited higher frequency cycles during the winter when resources (e.g., water and forage) were relatively poor, as well as during spring, which includes the giant panda's mating season. During the summer and autumn when resources were abundant, pandas exhibited a regular activity pattern with activity peaks every 24 hr. A pregnant individual showed distinct differences in her activity pattern from other giant pandas for several months following parturition. These results indicate that animals adjust activity cycles to adapt to seasonal variation of the resources and unique physiological periods. Wavelet coherency analysis also verified the synchronization of giant panda activity level with air temperature and solar radiation at the 24-hr band. Our study also shows that wavelet analysis is an effective tool for analyzing high-resolution activity pattern data and its relationship to internal and external states, an approach that has the potential to inform wildlife conservation and management across species.
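
As a rough illustration of the frequency-based analysis described above (illustrative code following the Torrence & Compo scale conventions, not the authors' pipeline), a Morlet wavelet power spectrum can be computed with plain NumPy:

```python
import numpy as np

def morlet_power(x, periods, dt=1.0, w0=6.0):
    """Continuous wavelet power of x at the requested periods, using a
    complex Morlet wavelet evaluated in the frequency domain."""
    n = len(x)
    x = x - x.mean()
    X = np.fft.fft(x)
    freqs = np.fft.fftfreq(n, d=dt)
    power = np.empty((len(periods), n))
    for i, p in enumerate(periods):
        # Fourier period -> wavelet scale (Torrence & Compo convention)
        scale = p * (w0 + np.sqrt(2 + w0**2)) / (4 * np.pi)
        # Morlet wavelet in the frequency domain (analytic: positive freqs only)
        psi = np.pi**-0.25 * np.exp(-0.5 * (scale * 2 * np.pi * freqs - w0)**2)
        psi *= np.sqrt(2 * np.pi * scale / dt) * (freqs > 0)
        power[i] = np.abs(np.fft.ifft(X * psi))**2
    return power
```

A 24-hr activity rhythm would appear as a band of high power at the 24-hr period, as in the panda analysis.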

  20. The role of alternative (advanced) conscious sedation techniques in dentistry for adult patients: a series of cases.

    Science.gov (United States)

    Robb, N

    2014-03-01

    The basic techniques of conscious sedation have been found to be safe and effective for the management of anxiety in adult dental patients who require sedation to undergo dental treatment. There remains great debate within the profession as to the role of the so-called advanced sedation techniques. This paper presents a series of nine patients who were managed with advanced sedation techniques where the basic techniques were either inappropriate or had previously failed to provide adequate relief of anxiety. In these cases, had advanced sedation techniques not been available, the most likely recourse would have been general anaesthesia, a treatment modality that current guidance indicates should not be used where there is an appropriate alternative. The sedation techniques used provided that appropriate alternative management strategy.

  1. Disentangling Time-series Spectra with Gaussian Processes: Applications to Radial Velocity Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Czekala, Ian [Kavli Institute for Particle Astrophysics and Cosmology, Stanford University, Stanford, CA 94305 (United States); Mandel, Kaisey S.; Andrews, Sean M.; Dittmann, Jason A. [Harvard-Smithsonian Center for Astrophysics, 60 Garden Street, Cambridge, MA 02138 (United States); Ghosh, Sujit K. [Department of Statistics, NC State University, 2311 Stinson Drive, Raleigh, NC 27695 (United States); Montet, Benjamin T. [Department of Astronomy and Astrophysics, University of Chicago, 5640 S. Ellis Avenue, Chicago, IL 60637 (United States); Newton, Elisabeth R., E-mail: iczekala@stanford.edu [Massachusetts Institute of Technology, Cambridge, MA 02138 (United States)

    2017-05-01

    Measurements of radial velocity variations from the spectroscopic monitoring of stars and their companions are essential for a broad swath of astrophysics; these measurements provide access to the fundamental physical properties that dictate all phases of stellar evolution and facilitate the quantitative study of planetary systems. The conversion of those measurements into both constraints on the orbital architecture and individual component spectra can be a serious challenge, however, especially for extreme flux ratio systems and observations with relatively low sensitivity. Gaussian processes define sampling distributions of flexible, continuous functions that are well-motivated for modeling stellar spectra, enabling proficient searches for companion lines in time-series spectra. We introduce a new technique for spectral disentangling, where the posterior distributions of the orbital parameters and intrinsic, rest-frame stellar spectra are explored simultaneously without needing to invoke cross-correlation templates. To demonstrate its potential, this technique is deployed on red-optical time-series spectra of the mid-M-dwarf binary LP661-13. We report orbital parameters with improved precision compared to traditional radial velocity analysis and successfully reconstruct the primary and secondary spectra. We discuss potential applications for other stellar and exoplanet radial velocity techniques and extensions to time-variable spectra. The code used in this analysis is freely available as an open-source Python package.
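
The method above builds on Gaussian process priors over spectra. As background, here is a minimal NumPy sketch of GP regression with a squared-exponential kernel (a toy illustration, not the authors' disentangling model or its kernel choices):

```python
import numpy as np

def rbf_kernel(a, b, amp=1.0, ell=1.0):
    """Squared-exponential covariance between coordinate vectors a and b."""
    return amp**2 * np.exp(-0.5 * (a[:, None] - b[None, :])**2 / ell**2)

def gp_predict(x_train, y_train, x_test, noise=1e-2, amp=1.0, ell=1.0):
    """Posterior mean and variance of a zero-mean GP conditioned on noisy data."""
    K = rbf_kernel(x_train, x_train, amp, ell) + noise**2 * np.eye(len(x_train))
    Ks = rbf_kernel(x_test, x_train, amp, ell)
    mean = Ks @ np.linalg.solve(K, y_train)
    var = amp**2 - np.einsum('ij,ji->i', Ks, np.linalg.solve(K, Ks.T))
    return mean, var
```

In the disentangling context, kernels like this define distributions over smooth rest-frame spectra, and the orbital parameters enter through Doppler shifts of the kernel inputs.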

  2. Graphical Data Analysis on the Circle: Wrap-Around Time Series Plots for (Interrupted) Time Series Designs.

    Science.gov (United States)

    Rodgers, Joseph Lee; Beasley, William Howard; Schuelke, Matthew

    2014-01-01

    Many data structures, particularly time series data, are naturally seasonal, cyclical, or otherwise circular. Past graphical methods for time series have focused on linear plots. In this article, we move graphical analysis onto the circle. We focus on 2 particular methods, one old and one new. Rose diagrams are circular histograms and can be produced in several different forms using the RRose software system. In addition, we propose, develop, illustrate, and provide software support for a new circular graphical method, called Wrap-Around Time Series Plots (WATS Plots), which is a graphical method useful to support time series analyses in general but in particular in relation to interrupted time series designs. We illustrate the use of WATS Plots with an interrupted time series design evaluating the effect of the Oklahoma City bombing on birthrates in Oklahoma County during the 10 years surrounding the bombing of the Murrah Building in Oklahoma City. We compare WATS Plots with linear time series representations and overlay them with smoothing and error bands. Each method is shown to have advantages in relation to the other; in our example, the WATS Plots more clearly show the existence and effect size of the fertility differential.
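
WATS Plots themselves are polar graphics; as a plain-NumPy illustration of the underlying wrap-around bookkeeping (not the authors' RRose/WATS software), one can fold a series onto its cycle and summarize each circular position:

```python
import numpy as np

def wrap_series(y, cycle):
    """Fold a 1-D series into rows of one cycle each (pad the tail with NaN),
    so column j collects every observation at position j of the cycle."""
    n = int(np.ceil(len(y) / cycle))
    folded = np.full(n * cycle, np.nan)
    folded[:len(y)] = y
    return folded.reshape(n, cycle)

def circular_positions(cycle):
    """Angular position (radians) of each slot on the circle, e.g. 12 months."""
    return 2 * np.pi * np.arange(cycle) / cycle

# Hypothetical example: 5 years of monthly data with a seasonal peak in June
months = np.arange(60)
y = 10 + 3 * np.cos(2 * np.pi * (months - 5) / 12)
folded = wrap_series(y, 12)
seasonal_mean = np.nanmean(folded, axis=0)
```

Plotting each row of `folded` against `circular_positions(12)` on a polar axis gives the wrap-around view; an intervention effect would appear as rings (cycles) that deviate after the event.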

  3. Image Analysis Technique for Material Behavior Evaluation in Civil Structures

    Science.gov (United States)

    Moretti, Michele; Rossi, Gianluca

    2017-01-01

    The article presents a hybrid monitoring technique for the measurement of the deformation field. The goal is to obtain information about crack propagation in existing structures, for the purpose of monitoring their state of health. The measurement technique is based on the capture and analysis of a digital image set. Special markers were used on the surface of the structures; these can be removed without damaging existing structures such as historical masonry. The digital image analysis was done using software specifically designed in Matlab to track the markers and determine the evolution of the deformation state. The method can be used on any type of structure but is particularly suitable when the surface of the structure must not be damaged. A series of experiments carried out on masonry walls of the Oliverian Museum (Pesaro, Italy) and Palazzo Silvi (Perugia, Italy) allowed validation of the elaborated procedure by comparing its results with those derived from traditional measuring techniques. PMID:28773129
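
As an illustration of the kind of marker tracking involved (a toy sketch, not the authors' Matlab software), the sub-pixel position of a bright marker can be estimated from an intensity-weighted centroid and differenced between frames:

```python
import numpy as np

def marker_centroid(img, threshold):
    """Intensity-weighted centroid (row, col) of pixels above threshold."""
    w = img * (img > threshold)
    rows, cols = np.indices(img.shape)
    total = w.sum()
    return (rows * w).sum() / total, (cols * w).sum() / total

def gaussian_marker(shape, center, sigma=2.0):
    """Synthetic bright marker on a dark background (stand-in for a frame)."""
    rows, cols = np.indices(shape)
    return np.exp(-((rows - center[0])**2 + (cols - center[1])**2) / (2 * sigma**2))

# Two 'frames': the marker moves by (0.8, -0.5) pixels between captures
f1 = gaussian_marker((64, 64), (30.0, 30.0))
f2 = gaussian_marker((64, 64), (30.8, 29.5))
d = np.subtract(marker_centroid(f2, 0.1), marker_centroid(f1, 0.1))
```

Real deformation monitoring adds camera calibration, identification of many markers per image, and conversion from pixel displacements to physical strain.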

  4. Low energy analysis techniques for CUORE

    Energy Technology Data Exchange (ETDEWEB)

    Alduino, C.; Avignone, F.T.; Chott, N.; Creswick, R.J.; Rosenfeld, C.; Wilson, J. [University of South Carolina, Department of Physics and Astronomy, Columbia, SC (United States); Alfonso, K.; Huang, H.Z.; Sakai, M.; Schmidt, J. [University of California, Department of Physics and Astronomy, Los Angeles, CA (United States); Artusa, D.R.; Rusconi, C. [University of South Carolina, Department of Physics and Astronomy, Columbia, SC (United States); INFN-Laboratori Nazionali del Gran Sasso, L'Aquila (Italy); Azzolini, O.; Camacho, A.; Keppel, G.; Palmieri, V.; Pira, C. [INFN-Laboratori Nazionali di Legnaro, Padua (Italy); Bari, G.; Deninno, M.M. [INFN-Sezione di Bologna, Bologna (Italy); Beeman, J.W. [Lawrence Berkeley National Laboratory, Materials Science Division, Berkeley, CA (United States); Bellini, F.; Cosmelli, C.; Ferroni, F.; Piperno, G. [Sapienza Universita di Roma, Dipartimento di Fisica, Rome (Italy); INFN-Sezione di Roma, Rome (Italy); Benato, G.; Singh, V. [University of California, Department of Physics, Berkeley, CA (United States); Bersani, A.; Caminata, A. [INFN-Sezione di Genova, Genoa (Italy); Biassoni, M.; Brofferio, C.; Capelli, S.; Carniti, P.; Cassina, L.; Chiesa, D.; Clemenza, M.; Faverzani, M.; Fiorini, E.; Gironi, L.; Gotti, C.; Maino, M.; Nastasi, M.; Nucciotti, A.; Pavan, M.; Pozzi, S.; Sisti, M.; Terranova, F.; Zanotti, L. [Universita di Milano-Bicocca, Dipartimento di Fisica, Milan (Italy); INFN-Sezione di Milano Bicocca, Milan (Italy); Branca, A.; Taffarello, L. [INFN-Sezione di Padova, Padua (Italy); Bucci, C.; Cappelli, L.; D'Addabbo, A.; Gorla, P.; Pattavina, L.; Pirro, S. [INFN-Laboratori Nazionali del Gran Sasso, L'Aquila (Italy); Canonica, L. [INFN-Laboratori Nazionali del Gran Sasso, L'Aquila (Italy); Massachusetts Institute of Technology, Cambridge, MA (United States); Cao, X.G.; Fang, D.Q.; Ma, Y.G.; Wang, H.W.; Zhang, G.Q. [Shanghai Institute of Applied Physics, Chinese Academy of Sciences, Shanghai (China); Cardani, L.; Casali, N.; Dafinei, I.; Morganti, S.; Mosteiro, P.J.; Tomei, C.; Vignati, M. [INFN-Sezione di Roma, Rome (Italy); Copello, S.; Di Domizio, S.; Marini, L.; Pallavicini, M. [INFN-Sezione di Genova, Genoa (Italy); Universita di Genova, Dipartimento di Fisica, Genoa (Italy); Cremonesi, O.; Ferri, E.; Giachero, A.; Pessina, G.; Previtali, E. [INFN-Sezione di Milano Bicocca, Milan (Italy); Cushman, J.S.; Davis, C.J.; Heeger, K.M.; Lim, K.E.; Maruyama, R.H. [Yale University, Department of Physics, New Haven, CT (United States); D'Aguanno, D.; Pagliarone, C.E. [INFN-Laboratori Nazionali del Gran Sasso, L'Aquila (Italy); Universita degli Studi di Cassino e del Lazio Meridionale, Dipartimento di Ingegneria Civile e Meccanica, Cassino (Italy); Dell'Oro, S. [INFN-Laboratori Nazionali del Gran Sasso, L'Aquila (Italy); INFN-Gran Sasso Science Institute, L'Aquila (Italy); Di Vacri, M.L.; Santone, D. [INFN-Laboratori Nazionali del Gran Sasso, L'Aquila (Italy); Universita dell'Aquila, Dipartimento di Scienze Fisiche e Chimiche, L'Aquila (Italy); Drobizhev, A.; Hennings-Yeomans, R.; Kolomensky, Yu.G.; Wagaarachchi, S.L. [University of California, Department of Physics, Berkeley, CA (United States); Lawrence Berkeley National Laboratory, Nuclear Science Division, Berkeley, CA (United States); Franceschi, M.A.; Ligi, C.; Napolitano, T. [INFN-Laboratori Nazionali di Frascati, Rome (Italy); Freedman, S.J. [University of California, Department of Physics, Berkeley, CA (United States); Lawrence Berkeley National Laboratory, Nuclear Science Division, Berkeley, CA (United States); Fujikawa, B.K.; Mei, Y.; Schmidt, B.; Smith, A.R.; Welliver, B. [Lawrence Berkeley National Laboratory, Nuclear Science Division, Berkeley, CA (United States); Giuliani, A.; Novati, V. [Universite Paris-Saclay, CSNSM, Univ. Paris-Sud, CNRS/IN2P3, Orsay (France); Gladstone, L.; Leder, A.; Ouellet, J.L.; Winslow, L.A. [Massachusetts Institute of Technology, Cambridge, MA (United States); Gutierrez, T.D. [California Polytechnic State University, Physics Department, San Luis Obispo, CA (United States); Haller, E.E. [Lawrence Berkeley National Laboratory, Materials Science Division, Berkeley, CA (United States); University of California, Department of Materials Science and Engineering, Berkeley, CA (United States); Han, K. [Shanghai Jiao Tong University, Department of Physics and Astronomy, Shanghai (China); Hansen, E. [University of California, Department of Physics and Astronomy, Los Angeles, CA (United States); Massachusetts Institute of Technology, Cambridge, MA (United States); Kadel, R. [Lawrence Berkeley National Laboratory, Physics Division, Berkeley, CA (United States); Martinez, M. [Sapienza Universita di Roma, Dipartimento di Fisica, Rome (Italy); INFN-Sezione di Roma, Rome (Italy); Universidad de Zaragoza, Laboratorio de Fisica Nuclear y Astroparticulas, Saragossa (Spain); Moggi, N.; Zucchelli, S. [INFN-Sezione di Bologna, Bologna (Italy); Universita di Bologna - Alma Mater Studiorum, Dipartimento di Fisica e Astronomia, Bologna (Italy); Nones, C. [CEA/Saclay, Service de Physique des Particules, Gif-sur-Yvette (France); Norman, E.B.; Wang, B.S. [Lawrence Livermore National Laboratory, Livermore, CA (United States); University of California, Department of Nuclear Engineering, Berkeley, CA (United States); O'Donnell, T. [Virginia Polytechnic Institute and State University, Center for Neutrino Physics, Blacksburg, VA (United States); Sangiorgio, S.; Scielzo, N.D. [Lawrence Livermore National Laboratory, Livermore, CA (United States); Wise, T. [Yale University, Department of Physics, New Haven, CT (United States); University of Wisconsin, Department of Physics, Madison, WI (United States); Woodcraft, A. [University of Edinburgh, SUPA, Institute for Astronomy, Edinburgh (United Kingdom); Zimmermann, S. [Lawrence Berkeley National Laboratory, Engineering Division, Berkeley, CA (United States)

    2017-12-15

    CUORE is a tonne-scale cryogenic detector operating at the Laboratori Nazionali del Gran Sasso (LNGS) that uses tellurium dioxide bolometers to search for neutrinoless double-beta decay of {sup 130}Te. Thanks to its ultra-low background and large target mass, CUORE is also suited to searches for low energy rare events such as solar axions or WIMP scattering. However, conducting such sensitive searches requires lowering the energy threshold to around 10 keV. In this paper, we describe the analysis techniques developed for the low energy analysis of CUORE-like detectors, using the data acquired from November 2013 to March 2015 by CUORE-0, a single-tower prototype designed to validate the assembly procedure and new cleaning techniques of CUORE. We explain in detail the energy threshold optimization, continuous monitoring of the trigger efficiency, data and event selection, and energy calibration at low energies. We also present the low energy background spectrum of CUORE-0 below 60 keV. Finally, we report the sensitivity of CUORE to WIMP annual modulation using the CUORE-0 energy threshold and background, as well as an estimate of the uncertainty on the nuclear quenching factor from nuclear recoils in CUORE-0. (orig.)

  5. Machine monitoring via current signature analysis techniques

    International Nuclear Information System (INIS)

    Smith, S.F.; Castleberry, K.N.; Nowlin, C.H.

    1992-01-01

    A significant need in the effort to provide increased production quality is improved plant equipment monitoring capabilities. Unfortunately, in today's tight economy, even such monitoring instrumentation must be implemented in a recognizably cost-effective manner. By analyzing the electric current drawn by motors, actuators, and other line-powered industrial equipment, significant insights into the operation of the prime movers, driven equipment, and even the power source can be obtained. The generic term 'current signature analysis' (CSA) has been coined to describe several techniques for extracting useful equipment or process monitoring information from the electrical power feed system. A patented method developed at Oak Ridge National Laboratory is described which recognizes the presence of line-current modulation produced by motors and actuators driving varying loads. The in-situ application of applicable linear demodulation techniques to the analysis of numerous motor-driven systems is also discussed. The use of high-quality amplitude- and angle-demodulation circuitry has permitted remote status monitoring of several types of medium- and high-power gas compressors in US DOE facilities driven by 3-phase induction motors rated from 100 to 3,500 hp, both with and without intervening speed increasers. Flow characteristics of the compressors, including various forms of abnormal behavior such as surging and rotating stall, produce specific time and frequency signatures at the output of the specialized detectors which can be easily identified for monitoring, control, and fault-prevention purposes. The resultant data are similar in form to information obtained via standard vibration-sensing techniques and can be analyzed using essentially identical methods. In addition, other machinery such as refrigeration compressors, brine pumps, vacuum pumps, fans, and electric motors have been characterized.
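
The amplitude-demodulation idea can be sketched in a few lines (an illustrative toy, not the patented ORNL circuitry): recover the envelope of a simulated line current and inspect its spectrum for load-induced modulation.

```python
import numpy as np

def amplitude_envelope(x):
    """Demodulated amplitude of a signal via the analytic-signal (Hilbert) envelope."""
    n = len(x)
    h = np.zeros(n)
    h[0] = 1.0
    if n % 2 == 0:
        h[n // 2] = 1.0
        h[1:n // 2] = 2.0
    else:
        h[1:(n + 1) // 2] = 2.0
    return np.abs(np.fft.ifft(np.fft.fft(x) * h))

# 60 Hz 'line current' whose amplitude is modulated at 4 Hz by a varying load
fs = 2000.0
t = np.arange(0, 2.0, 1.0 / fs)
current = (1.0 + 0.2 * np.sin(2 * np.pi * 4.0 * t)) * np.sin(2 * np.pi * 60.0 * t)
env = amplitude_envelope(current)
# The envelope spectrum now shows the 4 Hz load signature directly
spec = np.abs(np.fft.rfft(env - env.mean()))
f = np.fft.rfftfreq(len(env), 1.0 / fs)
```

In a real compressor, surging or rotating stall would leave characteristic lines in this envelope spectrum, analogous to vibration signatures.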

  6. Use of recurrence plot and recurrence quantification analysis in Taiwan unemployment rate time series

    Science.gov (United States)

    Chen, Wei-Shing

    2011-04-01

    The aim of the article is to answer the question of whether the dynamics of the Taiwan unemployment rate are generated by a non-linear deterministic process. This paper applies a recurrence plot and recurrence quantification approach based on the analysis of non-stationary hidden transition patterns of the unemployment rate of Taiwan. The case study uses the time series data of Taiwan's unemployment rate during the period from 1978/01 to 2010/06. The results show that recurrence techniques are able to identify various phases in the evolution of unemployment transition in Taiwan.
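
A recurrence plot is simply a thresholded distance matrix, and the simplest recurrence quantification measure is the recurrence rate. A minimal sketch (illustrative, not the paper's software):

```python
import numpy as np

def recurrence_matrix(x, eps):
    """Thresholded distance matrix: R[i, j] = 1 when |x_i - x_j| < eps."""
    return (np.abs(x[:, None] - x[None, :]) < eps).astype(int)

def recurrence_rate(R):
    """Fraction of recurrent point pairs, excluding the trivial main diagonal."""
    n = len(R)
    return (R.sum() - n) / (n * (n - 1))

# A periodic series recurs far more often than white noise of the same spread
t = np.arange(400)
periodic = np.sin(2 * np.pi * t / 25)
rng = np.random.default_rng(1)
noise = rng.uniform(-1, 1, 400)
rr_p = recurrence_rate(recurrence_matrix(periodic, 0.1))
rr_n = recurrence_rate(recurrence_matrix(noise, 0.1))
```

Further RQA measures (determinism, laminarity, trapping time) quantify the diagonal and vertical line structures in `R`, which is what reveals the transition phases discussed in the abstract.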

  7. Interpretation of engine cycle-to-cycle variation by chaotic time series analysis

    Energy Technology Data Exchange (ETDEWEB)

    Daw, C.S.; Kahl, W.K.

    1990-01-01

    In this paper we summarize preliminary results from applying a new mathematical technique, chaotic time series analysis (CTSA), to cylinder pressure data from a spark-ignition (SI) four-stroke engine fueled with both methanol and iso-octane. Our objective is to look for the presence of 'deterministic chaos' dynamics in peak pressure variations and to investigate the potential usefulness of CTSA as a diagnostic tool. Our results suggest that sequential peak cylinder pressures exhibit some characteristic features of deterministic chaos and that CTSA can extract previously unrecognized information from such data. 18 refs., 11 figs., 2 tabs.
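
Most chaotic time series methods start from a delay-coordinate (Takens) embedding of the scalar measurements; a minimal sketch (illustrative, not the authors' code):

```python
import numpy as np

def delay_embed(x, dim, tau):
    """Takens delay-coordinate embedding: rows are [x_t, x_{t+tau}, ...]."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])

# Embed a stand-in 'peak pressure' series in 3-D reconstructed phase space
x = np.sin(0.3 * np.arange(200))
emb = delay_embed(x, dim=3, tau=5)
```

Quantities such as correlation dimension and Lyapunov exponents, the usual CTSA diagnostics, are then estimated from the geometry of these embedded points.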

  8. Extended local similarity analysis (eLSA) of microbial community and other time series data with replicates.

    Science.gov (United States)

    Xia, Li C; Steele, Joshua A; Cram, Jacob A; Cardon, Zoe G; Simmons, Sheri L; Vallino, Joseph J; Fuhrman, Jed A; Sun, Fengzhu

    2011-01-01

    The increasing availability of time series microbial community data from metagenomics and other molecular biological studies has enabled the analysis of large-scale microbial co-occurrence and association networks. Among the many analytical techniques available, the Local Similarity Analysis (LSA) method is unique in that it captures local and potentially time-delayed co-occurrence and association patterns in time series data that cannot otherwise be identified by ordinary correlation analysis. However, LSA, as originally developed, does not consider time series data with replicates, which hinders the full exploitation of available information. With replicates, it is possible to understand the variability of the local similarity (LS) score and to obtain its confidence interval. We extended our LSA technique to time series data with replicates and termed it extended LSA, or eLSA. Simulations showed the capability of eLSA to capture subinterval and time-delayed associations. We implemented the eLSA technique into an easy-to-use analytic software package. The software pipeline integrates data normalization, statistical correlation calculation, statistical significance evaluation, and association network construction steps. We applied the eLSA technique to microbial community and gene expression datasets, where unique time-dependent associations were identified. The extended LSA analysis technique was demonstrated to reveal statistically significant local and potentially time-delayed association patterns in replicated time series data beyond that of ordinary correlation analysis. These statistically significant associations can provide insights into the real dynamics of biological systems. The newly designed eLSA software efficiently streamlines the analysis and is freely available from the eLSA homepage, which can be accessed at http://meta.usc.edu/softs/lsa.
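
The core of local similarity analysis is a Kadane-style scan for the highest-scoring local alignment of two normalized series, allowing a bounded time shift. A simplified sketch (positive associations only, no replicates or significance testing, unlike eLSA itself):

```python
import numpy as np

def local_similarity(x, y, max_delay=3):
    """Highest-scoring (possibly time-shifted) local alignment of two
    z-normalized series, via a maximum-subarray scan of x_t * y_{t+d}."""
    x = (x - x.mean()) / x.std()
    y = (y - y.mean()) / y.std()
    best = 0.0
    for d in range(-max_delay, max_delay + 1):
        xa = x[max(0, -d) : len(x) - max(0, d)]
        ya = y[max(0, d) : len(y) - max(0, -d)]
        run = 0.0
        for p in xa * ya:                  # Kadane's algorithm on products
            run = max(0.0, run + p)
            best = max(best, run)
    return best / len(x)
```

Two series that track each other over some subinterval, possibly with a lag, score high; unrelated series score near zero, which is the pattern ordinary whole-series correlation can miss.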

  9. Multivariate analysis of remote LIBS spectra using partial least squares, principal component analysis, and related techniques

    Energy Technology Data Exchange (ETDEWEB)

    Clegg, Samuel M [Los Alamos National Laboratory; Barefield, James E [Los Alamos National Laboratory; Wiens, Roger C [Los Alamos National Laboratory; Sklute, Elizabeth [Mount Holyoke College; Dyare, Melinda D [Mount Holyoke College

    2008-01-01

    Quantitative analysis with LIBS traditionally employs calibration curves that are complicated by chemical matrix effects. These matrix effects influence the LIBS plasma and the ratio of elemental composition to elemental emission line intensity. Consequently, LIBS calibration typically requires a priori knowledge of the unknown, so that a series of calibration standards similar to the unknown can be employed. In this paper, three new Multivariate Analysis (MVA) techniques are employed to analyze the LIBS spectra of 18 disparate igneous and highly-metamorphosed rock samples. Partial Least Squares (PLS) analysis is used to generate a calibration model from which unknown samples can be analyzed. Principal Components Analysis (PCA) and Soft Independent Modeling of Class Analogy (SIMCA) are employed to generate a model and predict the rock type of the samples. These MVA techniques appear to exploit the matrix effects associated with the chemistries of these 18 samples.
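
Of the three MVA techniques named, PCA is the simplest to sketch: project mean-centered spectra onto the leading singular vectors (an illustration with synthetic spectra, not the paper's LIBS data):

```python
import numpy as np

def pca_scores(spectra, n_components=2):
    """Project mean-centered spectra (rows = samples) onto the top
    principal components via SVD."""
    centered = spectra - spectra.mean(axis=0)
    U, s, Vt = np.linalg.svd(centered, full_matrices=False)
    return centered @ Vt[:n_components].T, s

# Two synthetic 'rock classes' whose spectra differ in one channel group
rng = np.random.default_rng(2)
base = rng.random(50)
class_a = base + 0.02 * rng.standard_normal((9, 50))
class_b = base + 0.02 * rng.standard_normal((9, 50))
class_b[:, :10] += 0.5            # systematic compositional difference
scores, s = pca_scores(np.vstack([class_a, class_b]))
```

The systematic difference dominates the first principal component, so the two classes separate along PC1, which is the behavior PCA and SIMCA exploit for rock-type classification.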

  10. Time series analysis of infrared satellite data for detecting thermal anomalies: a hybrid approach

    Science.gov (United States)

    Koeppen, W. C.; Pilger, E.; Wright, R.

    2011-07-01

    We developed and tested an automated algorithm that analyzes thermal infrared satellite time series data to detect and quantify the excess energy radiated from thermal anomalies such as active volcanoes. Our algorithm enhances the previously developed MODVOLC approach, a simple point operation, by adding a more complex time series component based on the methods of the Robust Satellite Techniques (RST) algorithm. Using test sites at Anatahan and Kīlauea volcanoes, the hybrid time series approach detected ~15% more thermal anomalies than MODVOLC with very few, if any, known false detections. We also tested gas flares in the Cantarell oil field in the Gulf of Mexico as an end-member scenario representing very persistent thermal anomalies. At Cantarell, the hybrid algorithm showed only a slight improvement, but it did identify flares that were undetected by MODVOLC. We estimate that at least 80 MODIS images for each calendar month are required to create good reference images necessary for the time series analysis of the hybrid algorithm. The improved performance of the new algorithm over MODVOLC will result in the detection of low temperature thermal anomalies that will be useful in improving our ability to document Earth's volcanic eruptions, as well as detecting low temperature thermal precursors to larger eruptions.
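
The reference-image logic of RST-style detection can be caricatured as a per-pixel background statistic plus a threshold test (a simplified sketch, not the authors' MODVOLC/RST hybrid):

```python
import numpy as np

def thermal_anomalies(stack, scene, k=3.0):
    """Flag pixels whose brightness exceeds the per-pixel reference mean
    by k standard deviations (a simplified RST-style index)."""
    mu = stack.mean(axis=0)
    sigma = stack.std(axis=0, ddof=1)
    z = (scene - mu) / np.where(sigma > 0, sigma, np.inf)
    return z > k

rng = np.random.default_rng(3)
stack = 280 + rng.standard_normal((24, 32, 32))   # 24 monthly reference images
scene = 280 + rng.standard_normal((32, 32))       # the image under test
scene[10, 12] += 25.0                             # a hot spot (e.g. active vent)
mask = thermal_anomalies(stack, scene)
```

The abstract's estimate of at least 80 images per calendar month reflects how many samples are needed for stable per-pixel reference statistics like `mu` and `sigma`.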

  11. Analysis of time series and size of equivalent sample

    International Nuclear Information System (INIS)

    Bernal, Nestor; Molina, Alicia; Pabon, Daniel; Martinez, Jorge

    2004-01-01

    In a meteorological context, a first approach to the modeling of time series is to use models of autoregressive type. This allows one to take into account the meteorological persistence or temporal behavior, thereby identifying the memory of the analyzed process. This article presents the concept of the size of an equivalent sample, which helps to identify subperiods with a similar structure in the data series. Moreover, we examine the alternative of adjusting the variance of the series, keeping in mind its temporal structure, as well as an adjustment to the covariance of two time series. Two examples are presented: the first corresponds to seven simulated series with an autoregressive structure of first order, and the second to seven meteorological series of anomalies of the air temperature at the surface in two Colombian regions.
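
For an AR(1)-like series, the standard equivalent (effective) sample size estimate follows directly from the lag-1 autocorrelation; a minimal sketch (the common n(1-r)/(1+r) formula, not necessarily the exact estimator of the paper):

```python
import numpy as np

def equivalent_sample_size(x):
    """Effective number of independent observations in an AR(1)-like series:
    n_eff = n * (1 - r1) / (1 + r1), with r1 the lag-1 autocorrelation."""
    x = x - x.mean()
    r1 = (x[:-1] @ x[1:]) / (x @ x)
    return len(x) * (1 - r1) / (1 + r1)

# A persistent AR(1) series carries far fewer independent data points
rng = np.random.default_rng(4)
n, phi = 2000, 0.7
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi * x[t - 1] + rng.standard_normal()
```

With phi = 0.7 the theoretical value is n * 0.3 / 1.7, roughly 353 for n = 2000, which is why persistence must be accounted for before testing significance.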

  12. Cost analysis and estimating tools and techniques

    CERN Document Server

    Nussbaum, Daniel

    1990-01-01

    Changes in production processes reflect the technological advances permeating our products and services. U.S. industry is modernizing and automating. In parallel, direct labor is fading as the primary cost driver while engineering and technology related cost elements loom ever larger. Traditional, labor-based approaches to estimating costs are losing their relevance. Old methods require augmentation with new estimating tools and techniques that capture the emerging environment. This volume represents one of many responses to this challenge by the cost analysis profession. The Institute of Cost Analysis (ICA) is dedicated to improving the effectiveness of cost and price analysis and enhancing the professional competence of its members. We encourage and promote exchange of research findings and applications between the academic community and cost professionals in industry and government. The 1990 National Meeting in Los Angeles, jointly sponsored by ICA and the National Estimating Society (NES),...

  13. Population estimation techniques for routing analysis

    International Nuclear Information System (INIS)

    Sathisan, S.K.; Chagari, A.K.

    1994-01-01

    A number of on-site and off-site factors affect the potential siting of a radioactive materials repository at Yucca Mountain, Nevada. Among them are transportation-related issues such as route selection and design, which involve evaluation of potential risks and impacts, including those related to population. Population characteristics (total population and density) are critical factors in risk assessment, emergency preparedness and response planning, and ultimately in route designation. This paper presents an application of Geographic Information System (GIS) technology to facilitate such analyses. Specifically, techniques to estimate critical population information are presented. A case study using the highway network in Nevada is used to illustrate the analyses. TIGER coverages are used as the basis for population information at the block level. The data are then synthesized at tract, county and state levels of aggregation. Of particular interest are population estimates for various corridor widths along transport corridors, ranging from 0.5 miles to 20 miles in this paper. A sensitivity analysis based on the level of data aggregation is also presented. The results of these analyses indicate that specific characteristics of the area and its population could be used as indicators to aggregate data appropriately for the analysis.
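
As a toy illustration of corridor-width population estimation (not the paper's GIS workflow), one can sum block-centroid populations within a given distance of a deliberately simplified straight route:

```python
import numpy as np

def corridor_population(points, pops, width):
    """Total population of block centroids within `width` miles of a route
    assumed to run along the x-axis (a simplified straight route)."""
    return pops[np.abs(points[:, 1]) <= width].sum()

# Hypothetical block centroids (x, y) in miles, with synthetic populations
rng = np.random.default_rng(5)
points = rng.uniform(-20, 20, (1000, 2))
pops = rng.integers(50, 500, 1000)
totals = {w: corridor_population(points, pops, w) for w in (0.5, 5, 20)}
```

A real analysis uses polyline routes, geodesic buffers, and census geography (block, tract, county), but the sensitivity of the estimate to corridor width already shows up in this toy version.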

  14. Techniques for Analysis of Plant Phenolic Compounds

    Directory of Open Access Journals (Sweden)

    Thomas H. Roberts

    2013-02-01

    Phenolic compounds are well-known phytochemicals found in all plants. They consist of simple phenols, benzoic and cinnamic acids, coumarins, tannins, lignins, lignans and flavonoids. Substantial developments in research focused on the extraction, identification and quantification of phenolic compounds as medicinal and/or dietary molecules have occurred over the last 25 years. Organic solvent extraction is the main method used to extract phenolics. Chemical procedures are used to detect the presence of total phenolics, while spectrophotometric and chromatographic techniques are utilized to identify and quantify individual phenolic compounds. This review addresses the application of different methodologies utilized in the analysis of phenolic compounds in plant-based products, including recent technical developments in the quantification of phenolics.

  15. Radio-analysis. Definitions and techniques

    International Nuclear Information System (INIS)

    Bourrel, F.; Courriere, Ph.

    2003-01-01

    This paper presents the different steps of the radio-labelling of a molecule for two purposes, radio-immuno-analysis and auto-radiography: 1 - definitions, radiations and radioprotection: activity of a radioactive source; half-life; radioactivity (alpha, beta and gamma radioactivity, internal conversion); radioprotection (irradiation, contamination); 2 - radionuclides used in medical biology and obtention of labelled molecules: gamma emitters ({sup 125}I, {sup 57}Co); beta emitters; obtention of labelled molecules (general principles, high specific activity and choice of the tracer, molecule to be labelled); main labelling techniques (iodination, tritium); purification of the labelled compound (dialysis, gel filtration or molecular exclusion chromatography, high performance liquid chromatography); quality estimation of the labelled compound (labelling efficiency calculation, immuno-reactivity conservation, stability and preservation). (J.S.)

  16. Interglacial climate dynamics and advanced time series analysis

    Science.gov (United States)

    Mudelsee, Manfred; Bermejo, Miguel; Köhler, Peter; Lohmann, Gerrit

    2013-04-01

    Studying the climate dynamics of past interglacials (IGs) helps to better assess the anthropogenically influenced dynamics of the current IG, the Holocene. We select the IG portions from the EPICA Dome C ice core archive, which covers the past 800 ka, to apply methods of statistical time series analysis (Mudelsee 2010). The analysed variables are deuterium/H (indicating temperature) (Jouzel et al. 2007), greenhouse gases (Siegenthaler et al. 2005, Loulergue et al. 2008, Lüthi et al. 2008) and a model-co-derived climate radiative forcing (Köhler et al. 2010). We additionally select high-resolution sea-surface-temperature records from the marine sedimentary archive. The first statistical method, persistence time estimation (Mudelsee 2002), lets us infer the 'climate memory' property of IGs. Second, linear regression informs about long-term climate trends during IGs. Third, ramp function regression (Mudelsee 2000) is adapted to look at abrupt climate changes during IGs. We compare the Holocene with previous IGs in terms of these mathematical approaches, interpret the results in a climate context, and assess uncertainties and the requirements on data from old IGs for yielding results of acceptable accuracy. This work receives financial support from the Deutsche Forschungsgemeinschaft (Project ClimSens within the DFG Research Priority Program INTERDYNAMIK) and the European Commission (Marie Curie Initial Training Network LINC, No. 289447, within the 7th Framework Programme). References Jouzel J, Masson-Delmotte V, Cattani O, Dreyfus G, Falourd S, Hoffmann G, Minster B, Nouet J, Barnola JM, Chappellaz J, Fischer H, Gallet JC, Johnsen S, Leuenberger M, Loulergue L, Luethi D, Oerter H, Parrenin F, Raisbeck G, Raynaud D, Schilt A, Schwander J, Selmo E, Souchez R, Spahni R, Stauffer B, Steffensen JP, Stenni B, Stocker TF, Tison JL, Werner M, Wolff EW (2007) Orbital and millennial Antarctic climate variability over the past 800,000 years. Science 317:793. Köhler P, Bintanja R
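
Ramp function regression fits a constant-linear-constant change model. A brute-force sketch of the idea (illustrative only; Mudelsee's estimator additionally handles timescale errors and bootstrap confidence intervals):

```python
import numpy as np

def fit_ramp(t, y, breakpoints):
    """Least-squares fit of a ramp (level a, linear rise, level b) by
    scanning candidate start/end change-points t1 < t2."""
    best = None
    for i, t1 in enumerate(breakpoints):
        for t2 in breakpoints[i + 1:]:
            r = np.clip((t - t1) / (t2 - t1), 0.0, 1.0)
            A = np.column_stack([1 - r, r])        # model: y = a*(1-r) + b*r
            (a, b), *_ = np.linalg.lstsq(A, y, rcond=None)
            rss = ((A @ np.array([a, b]) - y)**2).sum()
            if best is None or rss < best[0]:
                best = (rss, t1, t2, a, b)
    return best[1:]

# Synthetic abrupt transition: level 1 before t=30, level 3 after t=60
rng = np.random.default_rng(6)
t = np.arange(100.0)
true = np.where(t < 30, 1.0, np.where(t > 60, 3.0, 1.0 + 2.0 * (t - 30) / 30))
y = true + 0.1 * rng.standard_normal(100)
t1, t2, a, b = fit_ramp(t, y, np.arange(5.0, 95.0, 5.0))
```

The fitted change-points and levels are exactly the quantities used to characterize abrupt climate changes within an interglacial.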

  17. The Prediction of Teacher Turnover Employing Time Series Analysis.

    Science.gov (United States)

    Costa, Crist H.

    The purpose of this study was to combine knowledge of teacher demographic data with time-series forecasting methods to predict teacher turnover. Moving averages and exponential smoothing were used to forecast discrete time series. The study used data collected from the 22 largest school districts in Iowa, designated as FACT schools. Predictions…
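    As an illustration of the two forecasting techniques named above, here is a minimal sketch on a hypothetical annual turnover series; the counts, window, and smoothing parameter are made up, since the study's district data are not shown:

```python
import numpy as np

def moving_average_forecast(x, window):
    # Next-period forecast: average of the last `window` observations
    return float(np.mean(x[-window:]))

def exponential_smoothing_forecast(x, alpha):
    # Simple exponential smoothing: the final level is the next-period forecast
    level = float(x[0])
    for v in x[1:]:
        level = alpha * v + (1 - alpha) * level
    return level

turnover = [110, 118, 105, 120, 126, 119, 131, 125]   # hypothetical counts
ma = moving_average_forecast(turnover, 4)
es = exponential_smoothing_forecast(turnover, 0.3)
print(ma, round(es, 1))
```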

  18. ON THE FOURIER AND WAVELET ANALYSIS OF CORONAL TIME SERIES

    International Nuclear Information System (INIS)

    Auchère, F.; Froment, C.; Bocchialini, K.; Buchlin, E.; Solomon, J.

    2016-01-01

    Using Fourier and wavelet analysis, we critically re-assess the significance of our detection of periodic pulsations in coronal loops. We show that the proper identification of the frequency dependence and statistical properties of the different components of the power spectra provides a strong argument against the common practice of data detrending, which tends to produce spurious detections around the cut-off frequency of the filter. In addition, the white and red noise models built into the widely used wavelet code of Torrence and Compo cannot, in most cases, adequately represent the power spectra of coronal time series, thus also possibly causing false positives. Both effects suggest that several reports of periodic phenomena should be re-examined. The Torrence and Compo code nonetheless effectively computes rigorous confidence levels if provided with pertinent models of mean power spectra, and we describe the appropriate manner in which to call its core routines. We recall the meaning of the default confidence levels output from the code, and we propose new Monte-Carlo-derived levels that take into account the total number of degrees of freedom in the wavelet spectra. These improvements allow us to confirm that the power peaks that we detected have a very low probability of being caused by noise.
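    The point about testing spectral peaks against a pertinent noise model can be sketched as follows. This is an illustration of the general idea, not the authors' code: periodogram ordinates of an AR(1) "red noise" series are compared against a 95% confidence level derived from the AR(1) spectral shape, so only about 5% of ordinates should exceed it by chance. The AR(1) coefficient and series length are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate an AR(1) red-noise series x_t = a*x_{t-1} + e_t
n, a = 4096, 0.7
x = np.zeros(n)
eps = rng.standard_normal(n)
for t in range(1, n):
    x[t] = a * x[t - 1] + eps[t]

# Periodogram via the FFT
xf = np.fft.rfft(x - x.mean())
power = (np.abs(xf) ** 2) / n
freqs = np.fft.rfftfreq(n)

# Theoretical AR(1) spectral shape, normalized to the data's mean power
s = (1 - a**2) / (1 + a**2 - 2 * a * np.cos(2 * np.pi * freqs))
s *= power[1:].mean() / s[1:].mean()

# Each ordinate ~ S(f) * chi2(2)/2, i.e. exponential with mean S(f),
# so the 95% level is S(f) * (-ln 0.05)
level95 = s * (-np.log(0.05))
frac_above = float(np.mean(power[1:] > level95[1:]))
print(f"fraction of ordinates above the 95% level: {frac_above:.3f}")
```

With a mismatched noise model (e.g. white noise for red data), the low-frequency exceedance rate would be far higher, which is exactly the false-positive mechanism the abstract describes.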

  20. Prediction of solar cycle 24 using Fourier series analysis

    International Nuclear Information System (INIS)

    Khalid, M.; Sultana, M.; Zaidi, F.

    2014-01-01

    Predicting the behavior of solar activity has become very significant due to its influence on Earth and the surrounding environment. Apt predictions of the amplitude and timing of the next solar cycle will aid in the estimation of several consequences of space weather. In the past, many prediction procedures have been used and have been successful to various degrees in the field of solar activity forecasting. In this study, solar cycle 24 is forecasted by the Fourier series method. A comparative analysis has been made with the autoregressive integrated moving average (ARIMA) method. According to available records, January 2008 marked the minimum preceding solar cycle 24; the amplitude and shape of solar cycle 24 are approximated from monthly sunspot numbers. This forecast framework approximates a mean solar cycle 24, with the maximum appearing during May 2014 (± 8 months) and a maximum sunspot number of 98 ± 10. Solar cycle 24 will be ending in June 2020 (± 7 months). The difference between two consecutive peak values of solar cycles (i.e., solar cycles 23 and 24) is 165 months (± 6 months). (author)
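    A minimal sketch of the kind of Fourier-series fit named above, applied to a synthetic sunspot-like cycle; the period, harmonic count, and amplitudes are assumptions, whereas the actual study fits observed monthly sunspot numbers:

```python
import numpy as np

def fourier_design(t, period, n_harmonics):
    # Design matrix: constant plus cos/sin pairs at harmonics of 1/period
    cols = [np.ones_like(t)]
    for k in range(1, n_harmonics + 1):
        cols.append(np.cos(2 * np.pi * k * t / period))
        cols.append(np.sin(2 * np.pi * k * t / period))
    return np.column_stack(cols)

rng = np.random.default_rng(1)
period, K = 132.0, 3                          # ~11-year cycle, in months
t_fit = np.arange(0, 400, dtype=float)        # "observed" months
signal = 60 + 50 * np.sin(2 * np.pi * t_fit / period) + rng.normal(0, 5, t_fit.size)

# Least-squares fit of the truncated Fourier series
A = fourier_design(t_fit, period, K)
coef, *_ = np.linalg.lstsq(A, signal, rcond=None)

# Extrapolate one full cycle ahead
t_new = np.arange(400, 532, dtype=float)
forecast = fourier_design(t_new, period, K) @ coef
print(round(float(forecast.max()), 1))
```

The forecast maximum should recover the underlying peak level (about 110 in this toy setup).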

  1. Development of indicators of vegetation recovery based on time series analysis of SPOT Vegetation data

    Science.gov (United States)

    Lhermitte, S.; Tips, M.; Verbesselt, J.; Jonckheere, I.; Van Aardt, J.; Coppin, Pol

    2005-10-01

    Large-scale wild fires have direct impacts on natural ecosystems and play a major role in vegetation ecology and the carbon budget. Accurate methods for describing post-fire development of vegetation are therefore essential for the understanding and monitoring of terrestrial ecosystems. Time series analysis of satellite imagery offers the potential to quantify these parameters with spatial and temporal accuracy. The current research focuses on the potential of time series analysis of SPOT Vegetation S10 data (1999-2001) to quantify the vegetation recovery of large-scale burns detected in the framework of GBA2000. The objective of this study was to provide quantitative estimates of the spatio-temporal variation of vegetation recovery based on remote sensing indicators. Southern Africa was used as a pilot study area, given the availability of ground and satellite data. An automated technique was developed to extract consistent indicators of vegetation recovery from the SPOT-VGT time series. Reference areas were used to quantify the vegetation regrowth by means of Regeneration Indices (RI). Two kinds of recovery indicators (time- and value-based) were tested for RIs of NDVI, SR, SAVI, NDWI, and pure band information. The effects of vegetation structure and temporal fire regime features on the recovery indicators were subsequently analyzed. Statistical analyses were conducted to assess whether the recovery indicators were different for different vegetation types and dependent on the timing of the burning season. Results highlighted the importance of appropriate reference areas and of correct normalization of the SPOT-VGT data.
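    The Regeneration Index idea can be sketched as the ratio of a burned pixel's vegetation index to that of an unburned reference area; the exact RI definition used in the study may differ, and the reflectance values below are invented:

```python
import numpy as np

def ndvi(nir, red):
    # Normalized Difference Vegetation Index from NIR and red reflectances
    nir, red = np.asarray(nir, float), np.asarray(red, float)
    return (nir - red) / (nir + red)

def regeneration_index(burn_ndvi_series, ref_ndvi_series):
    # Hypothetical RI: burned-area NDVI normalized by the reference area
    return np.asarray(burn_ndvi_series, float) / np.asarray(ref_ndvi_series, float)

# Toy 10-day composites: a burned pixel recovering toward the reference level
burn = ndvi([0.30, 0.35, 0.42, 0.50], [0.20, 0.18, 0.15, 0.12])
ref = ndvi([0.55, 0.56, 0.55, 0.56], [0.10, 0.11, 0.10, 0.10])
ri = regeneration_index(burn, ref)
print(np.round(ri, 2))
```

An RI rising toward 1 over successive composites indicates vegetation regrowth approaching the unburned reference condition.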

  2. IBM i5/iSeries Primer: Concepts and Techniques for Programmers, Administrators, and System Operators

    CERN Document Server

    Forsythe, Kevin; Holt, Ted; Pence, Doug

    2012-01-01

    This comprehensive, 35-chapter book is the ultimate resource and a "must-have" for every professional working with the i5/iSeries. It is perfect for novice and intermediate programmers as well as for system administrators and operators. In simple, straightforward style, the authors explain core i5/iSeries concepts and show you step by step how to perform a wide variety of essential functions. The book includes sections on installation, operations, administration, system architecture, programming, the Internet, and troubleshooting. These sections are organized in free-standing style so you d

  3. Analysis of radiation-induced microchemical evolution in 300 series stainless steel

    International Nuclear Information System (INIS)

    Brager, H.R.; Garner, F.A.

    1980-03-01

    The irradiation of 300 series stainless steel by fast neutrons leads to an evolution of alloy microstructures that involves not only the formation of voids and dislocations, but also an extensive repartitioning of elements between various phases. This latter evolution has been shown to be the primary determinant of the alloy behavior in response to the large number of variables which influence void swelling and irradiation creep. The combined use of scanning transmission electron microscopy and energy-dispersive x-ray analysis has been the key element in the study of this phenomenon. Problems associated with the analysis of radioactive specimens are resolved by minor equipment modifications. Problems associated with spatial resolution limitations and the complexity and heterogeneity of the microchemical evolution have been overcome by using several data acquisition techniques. These include the measurement of compositional profiles near sinks, the use of foil-edge analysis, and the statistical sampling of many matrix and precipitate volumes

  4. Flame analysis using image processing techniques

    Science.gov (United States)

    Her Jie, Albert Chang; Zamli, Ahmad Faizal Ahmad; Zulazlan Shah Zulkifli, Ahmad; Yee, Joanne Lim Mun; Lim, Mooktzeng

    2018-04-01

    This paper presents image processing techniques with the use of fuzzy logic and a neural network approach to perform flame analysis. Flame diagnostics is important in industry to extract relevant information from flame images. Experimental tests were carried out in a model industrial burner at different flow rates. Flame features such as luminous and spectral parameters are extracted using image processing and the Fast Fourier Transform (FFT). Flame images are acquired using a FLIR infrared camera. Non-linearities such as thermal acoustic oscillations and background noise affect the stability of the flame. Flame velocity is one of the important characteristics that determine flame stability. In this paper, an image processing method is proposed to determine flame velocity. The power spectral density (PSD) graph is a good tool for vibration analysis, from which flame stability can be approximated. However, a more intelligent diagnostic system is needed to automatically determine flame stability. In this paper, flame features at different flow rates are compared and analyzed. The selected flame features are used as inputs to the proposed fuzzy inference system to determine flame stability. A neural network is used to test the performance of the fuzzy inference system.
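    The PSD-based part of such an analysis can be sketched as follows, with an assumed frame rate and a synthetic flicker signal; the real study works on features extracted from camera images:

```python
import numpy as np

# Estimate a flame flicker frequency from an intensity time series via its
# power spectrum. The 200 fps rate and 12 Hz flicker are assumed values.
fs = 200.0                          # frames per second (assumption)
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(2)
flicker = 12.0                      # Hz, in the typical buoyant-flame range
intensity = 1.0 + 0.3 * np.sin(2 * np.pi * flicker * t) + rng.normal(0, 0.1, t.size)

spec = np.abs(np.fft.rfft(intensity - intensity.mean())) ** 2
freqs = np.fft.rfftfreq(t.size, 1 / fs)
dominant = float(freqs[np.argmax(spec)])
print(f"dominant flicker frequency: {dominant:.1f} Hz")
```

A shift of this dominant frequency across flow rates is the kind of spectral feature the stability analysis above relies on.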

  5. Analysis of obsidians by PIXE technique

    International Nuclear Information System (INIS)

    Nuncio Q, A.E.

    1998-01-01

    This work presents the characterization of obsidian samples from different mineral sites in Mexico, undertaken by an ion beam analysis technique: PIXE (Proton Induced X-ray Emission). As part of an intensive investigation of obsidian in Mesoamerica by anthropologists from Mexico's National Institute of Anthropology and History, 818 samples were collected from different volcanic sources in central Mexico for the purpose of establishing a data bank of element concentrations for each source. Part of this collection was analyzed by neutron activation analysis and the concentrations of most of the important elements were reported. In this work, a non-destructive IBA technique (PIXE) is used to analyze obsidian samples. The application of this technique was carried out at the laboratories of the ININ Nuclear Center facilities. The samples consisted of obsidians from ten different volcanic sources. These pieces were mounted on a sample holder designed for the purpose of exposing each sample to the proton beam. The PIXE analysis was carried out with an ET Tandem Accelerator at the ININ. X-ray spectrometry was carried out with an external beam facility employing a Si(Li) detector set at 52.5 degrees in relation to the target normal (parallel to the beam direction) and 4.2 cm away from the target center. A filter was set in front of the detector to determine the best attenuation conditions for detecting most of the elements, taking into account that X-ray spectra from obsidians are dominated by intense major-element lines. Thus, a 28 μm-thick aluminium foil absorber was selected and used to reduce the intensity of the major lines as well as pile-up effects. The mean proton energy was 2.62 MeV, and the beam profile was about 4 mm in diameter.
    As results, elemental concentrations were found for a set of samples from ten different sources: Altotonga (Veracruz), Penjamo (Guanajuato), Otumba (Mexico), Zinapecuaro (Michoacan), Ucareo (Michoacan), Tres Cabezas (Puebla), Sierra Navajas (Hidalgo), Zaragoza

  6. Applying data mining techniques to medical time series: an empirical case study in electroencephalography and stabilometry

    Directory of Open Access Journals (Sweden)

    A. Anguera

    2016-01-01

    This paper illustrates the application of different knowledge discovery techniques for the purposes of classification within the above domains. The accuracy of this application for the two classes considered in each case is 99.86% and 98.11% for epilepsy diagnosis in the electroencephalography (EEG) domain and 99.4% and 99.1% for early-age sports talent classification in the stabilometry domain. The KDD techniques achieve better results than other traditional neural network-based classification techniques.

  7. Handbook of Qualitative Research Techniques and Analysis in Entrepreneurship

    DEFF Research Database (Denmark)

    One of the most challenging tasks in the research design process is choosing the most appropriate data collection and analysis techniques. This Handbook provides a detailed introduction to five qualitative data collection and analysis techniques pertinent to exploring entrepreneurial phenomena.

  8. Analysis of complex time series using refined composite multiscale entropy

    International Nuclear Information System (INIS)

    Wu, Shuen-De; Wu, Chiu-Wen; Lin, Shiou-Gwo; Lee, Kung-Yen; Peng, Chung-Kang

    2014-01-01

    Multiscale entropy (MSE) is an effective algorithm for measuring the complexity of a time series that has been applied in many fields successfully. However, MSE may yield an inaccurate estimation of entropy or induce undefined entropy because the coarse-graining procedure reduces the length of a time series considerably at large scales. Composite multiscale entropy (CMSE) was recently proposed to improve the accuracy of MSE, but it does not resolve undefined entropy. Here we propose a refined composite multiscale entropy (RCMSE) to improve CMSE. For short time series analyses, we demonstrate that RCMSE increases the accuracy of entropy estimation and reduces the probability of inducing undefined entropy.
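    A minimal sketch of the coarse-graining and sample-entropy building blocks behind MSE; RCMSE refines this by averaging the template-match counts over all coarse-graining offsets before taking the logarithm. The parameters m and r are conventional choices, not taken from the paper:

```python
import numpy as np

def coarse_grain(x, scale, offset=0):
    # Non-overlapping window means; RCMSE averages over all `offset` values
    x = np.asarray(x, float)[offset:]
    n = len(x) // scale
    return x[: n * scale].reshape(n, scale).mean(axis=1)

def sample_entropy(x, m, r):
    # Sample entropy with an absolute tolerance r (Chebyshev distance)
    x = np.asarray(x, float)

    def match_count(mm):
        templates = np.array([x[i : i + mm] for i in range(len(x) - mm)])
        c = 0
        for i in range(len(templates) - 1):
            d = np.max(np.abs(templates[i + 1 :] - templates[i]), axis=1)
            c += int(np.sum(d <= r))
        return c

    b, a = match_count(m), match_count(m + 1)
    return -np.log(a / b)

rng = np.random.default_rng(3)
white = rng.standard_normal(2000)
r0 = 0.15 * white.std()          # tolerance fixed from the original series
e1 = sample_entropy(coarse_grain(white, 1), 2, r0)
e5 = sample_entropy(coarse_grain(white, 5), 2, r0)
print(round(e1, 2), round(e5, 2))
```

For white noise, coarse-graining shrinks the variance while the tolerance stays fixed, so the entropy falls with scale; note also how the scale-5 series is only a fifth as long, which is the short-series problem RCMSE addresses.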

  9. Time series analysis of soil Radon-222 recorded at Kutch region, Gujarat, India

    International Nuclear Information System (INIS)

    Madhusudan Rao, K.; Rastogi, B.K.; Barman, Chiranjib; Chaudhuri, Hirok

    2013-01-01

    Kutch region in Gujarat lies in a seismically vulnerable zone (seismic zone V). After the devastating Bhuj earthquake (7.7M) of January 26, 2001 in the Kutch region, several researchers focused their attention on monitoring geophysical and geochemical precursors for earthquakes in the region. In order to find possible geochemical precursory signals for earthquake events, we monitored the radioactive gas radon-222 in subsurface soil gas in the Kutch region. We have analysed the recorded soil radon-222 time series by means of nonlinear techniques such as FFT power spectral analysis, empirical mode decomposition, and multi-fractal analysis, along with other linear statistical methods. Some results arising from the nonlinear analysis of the time series are discussed in the present paper. The entire analytical approach helped us recognize the nature and pattern of the soil radon-222 emanation process. Moreover, the recording and the statistical and nonlinear analysis of soil radon data in the Kutch region will help us understand the preparation phase of an imminent seismic event in the region. (author)

  10. Unsupervised land cover change detection: meaningful sequential time series analysis

    CSIR Research Space (South Africa)

    Salmon, BP

    2011-06-01

    Full Text Available An automated land cover change detection method is proposed that uses coarse spatial resolution hyper-temporal earth observation satellite time series data. The study compared three different unsupervised clustering approaches that operate on short...

  11. Correlation between detrended fluctuation analysis and the Lempel-Ziv complexity in nonlinear time series analysis

    International Nuclear Information System (INIS)

    Tang You-Fu; Liu Shu-Lin; Jiang Rui-Hong; Liu Ying-Hui

    2013-01-01

    We study the correlation between detrended fluctuation analysis (DFA) and the Lempel-Ziv complexity (LZC) in nonlinear time series analysis in this paper. Typical dynamic systems including a logistic map and a Duffing model are investigated. Moreover, the influence of Gaussian random noise on both the DFA and LZC is analyzed. The results show a high correlation between the DFA and LZC, which can quantify the non-stationarity and the nonlinearity of the time series, respectively. With the enhancement of the random component, the exponent α and the normalized complexity index C show increasing trends. In addition, C is found to be more sensitive to the fluctuation in the nonlinear time series than α. Finally, the correlation between the DFA and LZC is applied to the extraction of vibration signals for a reciprocating compressor gas valve, and an effective fault diagnosis result is obtained.
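    The DFA side of the comparison can be sketched compactly; the window sizes and first-order (DFA-1) detrending are common defaults, not necessarily those of the paper. For white noise the exponent α should come out near 0.5:

```python
import numpy as np

def dfa_exponent(x, scales):
    # DFA-1: integrate the series, detrend linearly within windows of size s,
    # and read alpha off the log-log slope of the fluctuation function F(s)
    y = np.cumsum(np.asarray(x, float) - np.mean(x))   # integrated profile
    flucts = []
    for s in scales:
        n_seg = len(y) // s
        f2 = []
        for i in range(n_seg):
            seg = y[i * s : (i + 1) * s]
            t = np.arange(s)
            coef = np.polyfit(t, seg, 1)               # linear detrend
            f2.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        flucts.append(np.sqrt(np.mean(f2)))
    alpha, _ = np.polyfit(np.log(scales), np.log(flucts), 1)
    return alpha

rng = np.random.default_rng(4)
scales = [16, 32, 64, 128, 256]
alpha_w = dfa_exponent(rng.standard_normal(5000), scales)
print(round(float(alpha_w), 2))   # ~0.5 expected for white noise
```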

  12. Advances in Antithetic Time Series Analysis : Separating Fact from Artifact

    Directory of Open Access Journals (Sweden)

    Dennis Ridley

    2016-01-01

    Full Text Available The problem of biased time series mathematical model parameter estimates is well known to be insurmountable. When used to predict future values by extrapolation, even a de minimis bias will eventually grow into a large bias, with misleading results. This paper elucidates how combining antithetic time series solves this baffling problem of bias in the fitted and forecast values by dynamic bias cancellation. Instead of growing to infinity, the average error can converge to a constant. (original abstract)

  13. Data imputation analysis for Cosmic Rays time series

    Science.gov (United States)

    Fernandes, R. C.; Lucio, P. S.; Fernandez, J. H.

    2017-05-01

    The occurrence of missing data in Galactic Cosmic Ray (GCR) time series is inevitable, since loss of data is due to mechanical and human failure, technical problems, and different periods of operation of GCR stations. The aim of this study was to perform multiple dataset imputation in order to depict the observational dataset. The study used the monthly time series of GCR Climax (CLMX) and Roma (ROME) from 1960 to 2004 to simulate scenarios of 10%, 20%, 30%, 40%, 50%, 60%, 70%, 80% and 90% of missing data compared to the observed ROME series, with 50 replicates. The CLMX station was then used as a proxy for allocation of these scenarios. Three different methods for monthly dataset imputation were selected: Amelia II, which runs the bootstrap Expectation Maximization algorithm; MICE, which runs an algorithm via Multivariate Imputation by Chained Equations; and MTSDI, an Expectation Maximization algorithm-based method for imputation of missing values in multivariate normal time series. The synthetic time series compared with the observed ROME series were also evaluated using several skill measures such as RMSE, NRMSE, Agreement Index, R, R2, F-test and t-test. The results showed that for CLMX and ROME, the R2 and R statistics were equal to 0.98 and 0.96, respectively. It was observed that increases in the number of gaps generate loss of quality of the time series. Data imputation was most efficient with the MTSDI method, with negligible errors and the best skill coefficients. The results suggest a limit of about 60% of missing data for imputation of monthly averages. It is noteworthy that the CLMX, ROME and KIEL stations present no missing data in the target period. This methodology allowed reconstructing 43 time series.
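    The proxy-station idea can be sketched with a much simpler method than the EM/chained-equation approaches above: regress the target on a correlated proxy over the jointly observed months and predict the gaps. All numbers below are synthetic:

```python
import numpy as np

rng = np.random.default_rng(5)
proxy = 100 + rng.normal(0, 5, 240)                 # e.g. a CLMX-like series
target = 2.0 * proxy + 10 + rng.normal(0, 2, 240)   # correlated ROME-like series

mask = rng.random(240) < 0.3                        # 30% simulated gaps
observed = ~mask

# Fit the regression on jointly observed months, then fill the gaps
slope, intercept = np.polyfit(proxy[observed], target[observed], 1)
filled = target.copy()
filled[mask] = slope * proxy[mask] + intercept

# Imputation error on the simulated gaps (true values are known here)
rmse = float(np.sqrt(np.mean((filled[mask] - target[mask]) ** 2)))
print(round(rmse, 2))
```

The RMSE should approach the irreducible noise level of the relation (about 2 in this toy setup), mirroring the skill-score evaluation against the known ROME values.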

  14. Geomechanical time series and its singularity spectrum analysis

    Czech Academy of Sciences Publication Activity Database

    Lyubushin, Alexei A.; Kaláb, Zdeněk; Lednická, Markéta

    2012-01-01

    Roč. 47, č. 1 (2012), s. 69-77 ISSN 1217-8977 R&D Projects: GA ČR GA105/09/0089 Institutional research plan: CEZ:AV0Z30860518 Keywords: geomechanical time series * singularity spectrum * time series segmentation * laser distance meter Subject RIV: DC - Seismology, Volcanology, Earth Structure Impact factor: 0.347, year: 2012 http://www.akademiai.com/content/88v4027758382225/fulltext.pdf

  15. Techniques and Applications of Urban Data Analysis

    KAUST Repository

    AlHalawani, Sawsan N.

    2016-05-26

    Digitization and characterization of urban spaces are essential components as we move to an ever-growing ’always connected’ world. Accurate analysis of such digital urban spaces has become more important as we continue to get spatial and social context-aware feedback and recommendations in our daily activities. Modeling and reconstruction of urban environments have thus gained unprecedented importance in the last few years. Such analysis typically spans multiple disciplines, such as computer graphics and computer vision, as well as architecture, geoscience, and remote sensing. Reconstructing an urban environment usually requires an entire pipeline consisting of different tasks. In such a pipeline, data analysis plays a strong role in acquiring meaningful insights from the raw data. This dissertation primarily focuses on the analysis of various forms of urban data and proposes a set of techniques to extract useful information, which is then used for different applications. The first part of this dissertation presents a semi-automatic framework to analyze facade images to recover individual windows along with their functional configurations such as open or (partially) closed states. The main advantage of recovering both the repetition patterns of windows and their individual deformation parameters is to produce a factored facade representation. Such a factored representation enables a range of applications including interactive facade images, improved multi-view stereo reconstruction, facade-level change detection, and novel image editing possibilities. The second part of this dissertation demonstrates the impact of a layout configuration on its performance. As a specific application scenario, I investigate the interior layout of warehouses wherein the goal is to assign items to their storage locations while reducing flow congestion and enhancing the speed of order picking processes. The third part of the dissertation proposes a method to classify cities

  16. All-phase MR angiography using independent component analysis of dynamic contrast enhanced MRI time series. φ-MRA

    International Nuclear Information System (INIS)

    Suzuki, Kiyotaka; Matsuzawa, Hitoshi; Watanabe, Masaki; Nakada, Tsutomu; Nakayama, Naoki; Kwee, I.L.

    2003-01-01

    Dynamic contrast enhanced magnetic resonance imaging (dynamic MRI) represents an MRI version of non-diffusible tracer methods, the main clinical use of which is the physiological construction of what is conventionally referred to as perfusion images. The raw data utilized for constructing MRI perfusion images are time series of pixel signal alterations associated with the passage of a gadolinium-containing contrast agent. Such time series are highly compatible with independent component analysis (ICA), a novel statistical signal processing technique capable of effectively separating a single mixture of multiple signals into their original independent source signals (blind separation). Accordingly, we applied ICA to dynamic MRI time series. The technique was found to be powerful, allowing for hitherto unobtainable assessment of regional cerebral hemodynamics in vivo. (author)
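    A toy sketch of the blind separation described above: two source time courses are mixed linearly and recovered with a minimal FastICA (tanh nonlinearity, symmetric decorrelation). A real dynamic-MRI analysis would run this across many pixel time series; the sources and mixing matrix here are invented:

```python
import numpy as np

rng = np.random.default_rng(6)
t = np.linspace(0, 1, 2000)
s1 = np.sign(np.sin(2 * np.pi * 7 * t))        # square-wave source
s2 = np.sin(2 * np.pi * 3 * t)                 # sinusoidal source
S = np.vstack([s1, s2])
A = np.array([[1.0, 0.6], [0.4, 1.0]])
X = A @ S                                      # observed mixtures

# Whiten the mixtures
X = X - X.mean(axis=1, keepdims=True)
cov = X @ X.T / X.shape[1]
d, E = np.linalg.eigh(cov)
Xw = E @ np.diag(d ** -0.5) @ E.T @ X

# FastICA iterations with symmetric orthogonalization
W = rng.standard_normal((2, 2))
for _ in range(200):
    g = np.tanh(W @ Xw)
    gp = 1 - g ** 2
    W_new = g @ Xw.T / Xw.shape[1] - np.diag(gp.mean(axis=1)) @ W
    u, _, vt = np.linalg.svd(W_new)
    W = u @ vt                                 # (W W^T)^(-1/2) W
rec = W @ Xw

# Each recovered component should correlate strongly with one source
c = np.abs(np.corrcoef(np.vstack([rec, S]))[:2, 2:])
print(np.round(c.max(axis=1), 2))
```

Up to sign and ordering (the usual ICA ambiguities), the recovered components match the original sources almost perfectly.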

  17. Numerical modeling techniques for flood analysis

    Science.gov (United States)

    Anees, Mohd Talha; Abdullah, K.; Nawawi, M. N. M.; Ab Rahman, Nik Norulaini Nik; Piah, Abd. Rahni Mt.; Zakaria, Nor Azazi; Syakir, M. I.; Mohd. Omar, A. K.

    2016-12-01

    Topographic and climatic changes are the main causes of abrupt flooding in tropical areas, and there is a need to find out the exact causes and effects of these changes. Numerical modeling techniques play a vital role in such studies due to their use of hydrological parameters, which are strongly linked with topographic changes. In this review, some of the widely used models utilizing hydrological and river modeling parameters, and their estimation in data-sparse regions, are discussed. Shortcomings of 1D and 2D numerical models and the possible improvements over these models through 3D modeling are also discussed. It is found that the HEC-RAS and FLO 2D models are best in terms of economical and accurate flood analysis for river and floodplain modeling, respectively. Limitations of FLO 2D in floodplain modeling, mainly floodplain elevation differences and vertical roughness in grids, were found, which can be improved through a 3D model. Therefore, a 3D model was found to be more suitable than 1D and 2D models in terms of vertical accuracy in grid cells. It was also found that 3D models for open channel flows have already been developed recently, but not for floodplains. Hence, it is suggested that a 3D model for floodplains should be developed by considering all the hydrological and high-resolution topographic parameters discussed in this review, to enhance the findings on the causes and effects of flooding.

  18. Charting the trends in nuclear techniques for analysis of inorganic environmental pollutants

    International Nuclear Information System (INIS)

    Braun, T.

    1986-01-01

    Publications in Analytical Abstracts in the period 1975-1984 and papers presented at the Modern Trends in Activation Analysis international conference series in the period 1961-1986 have been used as an empirical basis for assessing general trends in research and publication activity. Some ebbs and flows in the speciality of instrumental techniques for the analysis of environmental trace pollutants are revealed by a statistical analysis of the publications. (author)

  19. Analysis of cyclical behavior in time series of stock market returns

    Science.gov (United States)

    Stratimirović, Djordje; Sarvan, Darko; Miljković, Vladimir; Blesić, Suzana

    2018-01-01

    In this paper we have analyzed the scaling properties and cyclical behavior of three types of stock market index (SMI) time series: data belonging to stock markets of developed economies, emerging economies, and underdeveloped or transitional economies. We have used two techniques of data analysis to obtain and verify our findings: wavelet transform (WT) spectral analysis to identify cycles in the SMI returns data, and time-dependent detrended moving average (tdDMA) analysis to investigate local behavior around market cycles and trends. We found cyclical behavior in all SMI data sets that we have analyzed. Moreover, the positions and boundaries of the cyclical intervals that we found seem to be common for all markets in our dataset. We list and illustrate the presence of nine such periods in our SMI data. We report on the possibility of differentiating between the levels of growth of the analyzed markets by way of statistical analysis of the properties of the wavelet spectra that characterize particular peak behaviors. Our results show that measures like the relative WT energy content and the relative WT amplitude of the peaks in the small-scales region could be used to partially differentiate between market economies. Finally, we propose a way to quantify the level of development of a stock market based on estimation of the local complexity of the market's SMI series. From the local scaling exponents calculated for our nine peak regions we have defined what we named the Development Index, which proved, at least in the case of our dataset, to be suitable for ranking the SMI series that we have analyzed into three distinct groups.

  20. Modeling of human operator dynamics in simple manual control utilizing time series analysis. [tracking (position)

    Science.gov (United States)

    Agarwal, G. C.; Osafo-Charles, F.; Oneill, W. D.; Gottlieb, G. L.

    1982-01-01

    Time series analysis is applied to model human operator dynamics in pursuit and compensatory tracking modes. The normalized residual criterion is used as a one-step analytical tool to encompass the processes of identification, estimation, and diagnostic checking. A parameter constraining technique is introduced to develop more reliable models of human operator dynamics. The human operator is adequately modeled by a second order dynamic system both in pursuit and compensatory tracking modes. In comparing the data sampling rates, 100 msec between samples is adequate and is shown to provide better results than 200 msec sampling. The residual power spectrum and eigenvalue analysis show that the human operator is not a generator of periodic characteristics.
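    The second-order model identification can be sketched on synthetic data (the coefficients and series length are made up, not the paper's tracking records): fit an AR(2) by least squares, then check residual whiteness via the lag-1 residual autocorrelation, in the spirit of the diagnostic checking described above:

```python
import numpy as np

rng = np.random.default_rng(8)
n, a1, a2 = 3000, 1.2, -0.4          # stationary AR(2) (roots inside unit circle)
x = np.zeros(n)
e = rng.normal(0, 1, n)
for t in range(2, n):
    x[t] = a1 * x[t - 1] + a2 * x[t - 2] + e[t]

# Least-squares AR(2): regress x_t on (x_{t-1}, x_{t-2})
X = np.column_stack([x[1:-1], x[:-2]])
y = x[2:]
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ coef

# Residuals of an adequate model should be white: lag-1 autocorrelation ~ 0
r1 = float(np.corrcoef(resid[:-1], resid[1:])[0, 1])
print(np.round(coef, 2), round(r1, 3))
```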

  1. Time Series Analysis of Wheat flour Price Shocks in Pakistan: A Case Analysis

    OpenAIRE

    Asad Raza Abdi; Ali Hassan Halepoto; Aisha Bashir Shah; Faiz M. Shaikh

    2013-01-01

    The current research investigates wheat flour price shocks in Pakistan: a case analysis. Data were collected from secondary sources using time series analysis, and were analyzed using SPSS version 20. It was revealed that the price of wheat flour has increased over the last four decades, and the trend of price shocks shows that market variation and supply and demand shocks play a positive role in shocks to wheat prices. It was further revealed th...

  2. Time series analysis of reference crop evapotranspiration for Bokaro District, Jharkhand, India

    Directory of Open Access Journals (Sweden)

    Gautam Ratnesh

    2016-09-01

    Full Text Available Evapotranspiration is one of the major elements of the water cycle. More accurate measurement and forecasting of evapotranspiration would enable more efficient water resources management. This study is therefore particularly focused on evapotranspiration modelling and forecasting, since forecasting would provide better information for optimal water resources management. There are numerous techniques of evapotranspiration forecasting, including autoregressive (AR) and moving average (MA), autoregressive moving average (ARMA), autoregressive integrated moving average (ARIMA), Thomas-Fiering, etc. Of these models, the ARIMA model has been found to be more suitable for the analysis and forecasting of hydrological events. Therefore, in this study ARIMA models have been used for forecasting of mean monthly reference crop evapotranspiration by stochastic analysis. A data series of 102 years, i.e. 1224 months, for Bokaro District was used for analysis and forecasting. Different orders of ARIMA model were selected on the basis of the autocorrelation function (ACF) and partial autocorrelation function (PACF) of the data series. The maximum likelihood method was used for determining the parameters of the models. On the basis of the statistical parameters of the models, the best-fitted model is ARIMA (0, 1, 4)(0, 1, 1)₁₂.
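    The ACF inspection used for order selection can be sketched on a synthetic monthly series with a trend and an annual cycle (all numbers invented): after first differencing (the d = 1 in the ARIMA notation), a strong lag-12 autocorrelation remains, pointing toward seasonal model terms:

```python
import numpy as np

def acf(x, nlags):
    # Sample autocorrelation function up to lag `nlags`
    x = np.asarray(x, float) - np.mean(x)
    denom = np.sum(x ** 2)
    return np.array([np.sum(x[k:] * x[: len(x) - k]) / denom
                     for k in range(nlags + 1)])

rng = np.random.default_rng(9)
months = np.arange(1224)                        # 102 years of monthly data
seasonal = 5 * np.sin(2 * np.pi * months / 12)  # annual evapotranspiration cycle
series = 50 + 0.01 * months + seasonal + rng.normal(0, 1, months.size)

diffed = np.diff(series)                        # first difference removes the trend
r = acf(diffed, 12)
print(round(float(r[12]), 2))                   # strong lag-12 autocorrelation remains
```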

  3. Track Irregularity Time Series Analysis and Trend Forecasting

    Directory of Open Access Journals (Sweden)

    Jia Chaolong

    2012-01-01

    Full Text Available The combination of linear and nonlinear methods is widely used in the prediction of time series data. This paper analyzes track irregularity time series data by using gray incidence degree models and methods of data transformation, trying to find the implicit relationships within the time series data. In this paper, GM (1,1), which is based on a first-order, single-variable linear differential equation, is used, after an adaptive improvement and error correction, to predict the long-term changing trend of track irregularity at a fixed measuring point; the stochastic linear AR, Kalman filtering, and artificial neural network models are applied to predict the short-term changing trend of track irregularity over a unit section. Both long-term and short-term predictions prove that the model is effective and can achieve the expected accuracy.
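    A minimal GM (1,1) sketch, the first-order single-variable grey model named above, fitted to an invented, roughly exponential growth series and extrapolated three steps; the adaptive improvement and error correction of the paper are not included:

```python
import numpy as np

def gm11(x0, n_forecast):
    # GM(1,1): fit dx1/dt + a*x1 = b on the accumulated series x1 = cumsum(x0)
    x0 = np.asarray(x0, float)
    x1 = np.cumsum(x0)
    z1 = 0.5 * (x1[1:] + x1[:-1])                   # background values
    B = np.column_stack([-z1, np.ones_like(z1)])
    a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]
    k = np.arange(len(x0) + n_forecast)
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a
    # Differencing the accumulated prediction recovers the original scale
    return np.diff(x1_hat, prepend=0.0)[len(x0):]

# Toy irregularity-growth data (hypothetical, roughly exponential)
data = [2.0, 2.2, 2.45, 2.7, 3.0, 3.3, 3.65]
fc = gm11(data, 3)
print(np.round(fc, 2))
```

The forecast continues the fitted exponential trend, which is the long-term trend-extrapolation role GM (1,1) plays in the paper.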

  4. Parametric time series analysis of geoelectrical signals: an application to earthquake forecasting in Southern Italy

    Directory of Open Access Journals (Sweden)

    V. Tramutoli

    1996-06-01

    Full Text Available An autoregressive model was selected to describe geoelectrical time series. An objective technique was subsequently applied to analyze and discriminate values above (below) an a priori fixed threshold, possibly related to seismic events. A complete check of the model and the main guidelines to estimate the occurrence probability of extreme events are reported. A first application of the proposed technique is discussed through the analysis of the experimental data recorded by an automatic station located in Tito, a small town on the Apennine chain in Southern Italy. This region was hit by the November 1980 Irpinia-Basilicata earthquake and is one of the most active areas of the Mediterranean region. After a preliminary filtering procedure to reduce the influence of external parameters (i.e., meteo-climatic effects), it was demonstrated that the geoelectrical residual time series are well described by a second-order autoregressive model. Our findings outline a statistical methodology to evaluate the efficiency of electrical seismic precursors.

  5. Analysis of historical series of industrial demand of energy; Analisi delle serie storiche dei consumi energetici dell'industria

    Energy Technology Data Exchange (ETDEWEB)

    Moauro, F. [ENEA, Centro Ricerche Casaccia, Rome (Italy). Dip. Energia

    1995-03-01

    This paper reports a short-term analysis of the Italian demand for energy sources and a test of a statistical model that treats the industrial demand for energy sources as a function of prices and production, according to neoclassical microeconomic theory. For this purpose, monthly time series of industrial consumption of the main energy sources in 6 sectors, industrial production indexes in the same sectors, and indexes of energy prices (coal, natural gas, oil products, electricity) have been used. The statistical methodology draws on the modern analysis of time series and specifically on transfer function models. These permit rigorous identification and representation of the most important dynamic relations between the dependent variables (production and prices), as relations of an input-output system. The results show a significant positive correlation between energy consumption and prices. Furthermore, the reliability of the forecasts and their usefulness as monthly energy indicators have been demonstrated.

  6. Chaos in Electronic Circuits: Nonlinear Time Series Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Wheat, Jr., Robert M. [Kennedy Western Univ., Cheyenne, WY (United States)

    2003-07-01

    Chaos in electronic circuits is a phenomenon that was largely ignored by engineers, manufacturers, and researchers until the early 1990s and the work of Chua, Matsumoto, and others. As the world becomes more dependent on electronic devices, the detrimental effects of non-normal operation of these devices become more significant. Developing a better understanding of the mechanisms involved in the chaotic behavior of electronic circuits is a logical step toward the prediction and prevention of any potentially catastrophic occurrence of this phenomenon. A better understanding of chaotic behavior, in a general sense, could also lead to better accuracy in the prediction of natural events such as weather, volcanic activity, and earthquakes. As a first step toward this improved understanding, and as part of the research reported here, methods of computer modeling, identifying, analyzing, and producing chaotic behavior in simple electronic circuits have been developed. The computer models were developed using both the Alternative Transient Program (ATP) and Spice, the analysis techniques have been implemented using the C and C++ programming languages, and the chaotically behaving circuits were developed using "off the shelf" electronic components.

  7. Financing Human Development for Sectorial Growth: A Time Series Analysis

    Directory of Open Access Journals (Sweden)

    Shobande Abdul Olatunji

    2017-06-01

    Full Text Available The role which financing human development plays in fostering the sectorial growth of an economy cannot be overemphasized. It is a key instrument which can be utilized to alleviate poverty, create employment and sustain economic growth and development. Financing human development for sectorial growth has thus taken center stage in the economic growth and development strategies of most countries. In a constructive effort to examine the in-depth relationship between the variables in the Nigerian context, this paper provides evidence on the impact of financing human development on sectorial growth in Nigeria between 1982 and 2016, using the Johansen co-integration technique to test for co-integration among the variables and the Vector Error Correction Model (VECM) to ascertain the speed of adjustment of the variables to their long-run equilibrium position. The analysis shows that long- and short-run relationships exist between financing human capital development and sectorial growth during the period reviewed. Therefore, the paper argues that for an active foundation for sustainable sectorial growth and development, financing human capital development across each unit is urgently required through increased budgetary allocation to both the health and educational sectors, since they are key components of human capital development in a nation.

  8. Successful treatment of rare-earth magnet ingestion via minimally invasive techniques: a case series.

    Science.gov (United States)

    Kosut, Jessica S; Johnson, Sidney M; King, Jeremy L; Garnett, Gwendolyn; Woo, Russell K

    2013-04-01

    Cases of rare-earth magnet ingestions have been increasingly reported in the literature. However, these descriptions have focused on the severity of the injuries, rather than the clinical presentation and/or therapeutic approach. We report a series of eight children, ranging in age from 2 to 10 years, who ingested powerful rare-earth magnets. The rare-earth magnets were marketed in 2009 under the trade name Buckyballs(®) (Maxfield & Oberton, New York, NY). They are about 5 mm in size, spherical, and brightly colored, making them appealing for young children to play with and place in their mouths. Three children presented within hours of ingestion, and the magnets were successfully removed via endoscopy in two, whereas the third child required laparoscopy. No fistulas were found in these children. A fourth child presented 2 days after ingestion with evidence of bowel wall erosion, but without fistula formation; the magnets were removed via laparoscopy. A fifth child ingested nine magnets in a ring formation, which were removed via colonoscopy without evidence of injury or fistula formation. The three remaining children presented late (5-8 days after ingestion) and were found to have associated fistulas. They were treated successfully with a combination of endoscopy and laparoscopy with fluoroscopy. None of the children in our series required an open surgical procedure. All children were discharged home without complications. This case series highlights the potential dangers of rare-earth magnet ingestion in children. Our experience suggests that prompt intervention using minimally invasive approaches can lead to successful outcomes.

  9. Unstable Periodic Orbit Analysis of Histograms of Chaotic Time Series

    International Nuclear Information System (INIS)

    Zoldi, S.M.

    1998-01-01

    Using the Lorenz equations, we have investigated whether unstable periodic orbits (UPOs) associated with a strange attractor may predict the occurrence of the robust sharp peaks in histograms of some experimental chaotic time series. Histograms with sharp peaks occur for the Lorenz parameter value r=60.0 but not for r=28.0 , and the sharp peaks for r=60.0 do not correspond to a histogram derived from any single UPO. However, we show that histograms derived from the time series of a non-Axiom-A chaotic system can be accurately predicted by an escape-time weighting of UPO histograms. copyright 1998 The American Physical Society

  10. Minimum entropy density method for the time series analysis

    Science.gov (United States)

    Lee, Jeong Won; Park, Joongwoo Brian; Jo, Hang-Hyun; Yang, Jae-Suk; Moon, Hie-Tae

    2009-01-01

    The entropy density is an intuitive and powerful concept for studying the complicated nonlinear processes of physical systems. We develop the minimum entropy density method (MEDM) to detect the structure scale of a given time series, defined as the scale at which the uncertainty is minimized and hence the pattern most clearly revealed. The MEDM is applied to the financial time series of the Standard and Poor's 500 index from February 1983 to April 2006. The temporal behavior of the structure scale is then obtained and analyzed in relation to the information delivery time and the efficient market hypothesis.

  11. Quantitative blood flow analysis with digital techniques

    International Nuclear Information System (INIS)

    Forbes, G.

    1984-01-01

    The general principles of digital techniques in quantitating absolute blood flow during arteriography are described. Results are presented for a phantom constructed to correlate digitally calculated absolute flow with direct flow measurements. The clinical use of digital techniques in cerebrovascular angiography is briefly described. (U.K.)

  12. Characteristic vector analysis of inflection ratio spectra: New technique for analysis of ocean color data

    Science.gov (United States)

    Grew, G. W.

    1985-01-01

    Characteristic vector analysis applied to inflection ratio spectra is a new approach to analyzing spectral data. Applied to remote data collected with the multichannel ocean color sensor (MOCS), a passive sensor, the technique simultaneously maps the distribution of two different phytopigments, chlorophyll a and phycoerythrin, in the ocean. The data set presented is from a series of warm-core ring missions conducted during 1982. The data compare favorably with a theoretical model and with data collected on the same mission by an active sensor, the airborne oceanographic lidar (AOL).

  13. Supraretinacular endoscopic carpal tunnel release: surgical technique with prospective case series.

    Science.gov (United States)

    Ecker, J; Perera, N; Ebert, J

    2015-02-01

    Current techniques for endoscopic carpal tunnel release use an infraretinacular approach, inserting the endoscope deep to the flexor retinaculum. We present a supraretinacular endoscopic carpal tunnel release technique in which a dissecting endoscope is inserted superficial to the flexor retinaculum, which improves vision and the ability to dissect and manipulate the median nerve and tendons during surgery. The motor branch of the median nerve and connections between the median and ulnar nerve can be identified and dissected. Because the endoscope is inserted superficial to the flexor retinaculum, the median nerve is not compressed before division of the retinaculum and, as a result, we have observed no cases of the transient median nerve deficits that have been reported using infraretinacular endoscopic techniques. © The Author(s) 2014.

  14. TAPP - Stuttgart technique and result of a large single center series

    Directory of Open Access Journals (Sweden)

    Bittner R

    2006-01-01

    Full Text Available Laparoscopic hernioplasty is considered a difficult operation. The operative technique determines the frequency of complications, the time to recovery and the rate of recurrence. A proper technique is absolutely necessary to achieve results superior to open hernia surgery. Technique: The key points of our technique are (1) use of nondisposable instruments; (2) use of blunt trocars, consisting of expanding and non-incisive cone-shaped tips; (3) a spacious and curved opening of the peritoneum, high above all possible hernia openings; (4) meticulous dissection of the entire pelvic floor; (5) complete reduction of the hernial sac; (6) wide parietalization of the peritoneal sac, at least down to the middle of the psoas muscle; (7) implantation of a large mesh, at least 10 cm x 15 cm; (8) fixation of the mesh by clips to Cooper's ligament, to the rectus muscle and lateral to the epigastric vessels, high above the iliopubic tract; (9) the use of glue, which also allows fixation in the latero-caudal region; and (10) closure of the peritoneum by running suture. Results: With this technique, in 12,678 hernia repairs the following results were achieved: operating time, 40 min; morbidity, 2.9%; recurrence rate, 0.7%; time off work, 14 days. Similar results were achieved in all types of hernias (recurrence after previous open surgery, recurrence after previous preperitoneal operation, scrotal hernia, hernia in patients after transabdominal prostate resection). Summary: Laparoscopic hernia repair can be performed successfully in clinical practice, even by surgeons in training. The precondition for success is a strictly standardized operative technique and a well-structured educational program.

  15. Dynamic Factor Analysis of Nonstationary Multivariate Time Series.

    Science.gov (United States)

    Molenaar, Peter C. M.; And Others

    1992-01-01

    The dynamic factor model proposed by P. C. Molenaar (1985) is exhibited, and a dynamic nonstationary factor model (DNFM) is constructed with latent factor series that have time-varying mean functions. The use of a DNFM is illustrated using data from a television viewing habits study. (SLD)

  16. Seasonal and annual precipitation time series trend analysis in North Carolina, United States

    Science.gov (United States)

    Sayemuzzaman, Mohammad; Jha, Manoj K.

    2014-02-01

    The present study performs a spatial and temporal trend analysis of the annual and seasonal precipitation time series from a set of 249 uniformly distributed stations across the state of North Carolina, United States, over the period 1950-2009. The Mann-Kendall (MK) test, the Theil-Sen approach (TSA) and the Sequential Mann-Kendall (SQMK) test were applied to quantify the significance of trend, the magnitude of trend, and the trend shift, respectively. Regional (mountain, piedmont and coastal) precipitation trends were also analyzed using the above-mentioned tests. Prior to the application of the statistical tests, a pre-whitening technique was used to eliminate the effect of autocorrelation in the precipitation data series. The application of the above-mentioned procedures revealed a very notable statewide increasing trend for winter precipitation and a decreasing trend for fall precipitation. Statewide mixed (increasing/decreasing) trends were detected in the annual, spring, and summer precipitation time series. Significant trends (confidence level ≥ 95%) were detected at only 8, 7, 4 and 10 of the 249 stations in winter, spring, summer, and fall, respectively. The magnitude of the largest increasing (decreasing) precipitation trend was about 4 mm/season (-4.50 mm/season) in the fall (summer) season. Annual precipitation trend magnitudes varied between -5.50 mm/year and 9 mm/year. The regional trend analysis found increasing precipitation in the mountain and coastal regions in general, except during the winter. The piedmont region was found to have increasing trends in summer and fall, but decreasing trends in winter, spring and on an annual basis. The SQMK "trend shift analysis" identified a significant shift during 1960-70 in most parts of the state. Finally, the comparison between winter (summer) precipitation and the North Atlantic Oscillation (Southern Oscillation) indices concluded that the variability and trend of precipitation can be explained by the
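
The MK test and TSA slope estimator used in this study are short to implement; the following is a minimal sketch on a toy series (without the tie correction or the pre-whitening step that the study applies):

```python
import math

def mann_kendall(x):
    """Mann-Kendall trend test: S statistic and normal-approximation Z
    (no tie correction, no pre-whitening)."""
    n = len(x)
    s = sum((x[j] > x[i]) - (x[j] < x[i])
            for i in range(n) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        return s, (s - 1) / math.sqrt(var_s)
    if s < 0:
        return s, (s + 1) / math.sqrt(var_s)
    return s, 0.0

def theil_sen(x):
    """Theil-Sen estimator: median of all pairwise slopes."""
    slopes = sorted((x[j] - x[i]) / (j - i)
                    for i in range(len(x)) for j in range(i + 1, len(x)))
    m = len(slopes)
    return slopes[m // 2] if m % 2 else 0.5 * (slopes[m // 2 - 1] + slopes[m // 2])

# Toy "annual precipitation" series (mm) rising ~9 mm/year with fluctuation
series = [1000 + 9 * t + (-1) ** t * 5 for t in range(30)]
s, z = mann_kendall(series)
slope = theil_sen(series)
print(s, z, slope)   # |Z| > 1.96 marks significance at the 95% confidence level
```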

  17. Complexity analysis of the turbulent environmental fluid flow time series

    Science.gov (United States)

    Mihailović, D. T.; Nikolić-Đorić, E.; Drešković, N.; Mimić, G.

    2014-02-01

    We have used the Kolmogorov complexities, sample and permutation entropies to quantify the randomness degree in river flow time series of two mountain rivers in Bosnia and Herzegovina, representing the turbulent environmental fluid, for the period 1926-1990. In particular, we have examined the monthly river flow time series from two rivers (the Miljacka and the Bosnia) in the mountain part of their flow and then calculated the Kolmogorov complexity (KL) based on the Lempel-Ziv Algorithm (LZA) (lower-KLL and upper-KLU), sample entropy (SE) and permutation entropy (PE) values for each time series. The results indicate that the KLL, KLU, SE and PE values in two rivers are close to each other regardless of the amplitude differences in their monthly flow rates. We have illustrated the changes in mountain river flow complexity by experiments using (i) the data set for the Bosnia River and (ii) anticipated human activities and projected climate changes. We have explored the sensitivity of considered measures in dependence on the length of time series. In addition, we have divided the period 1926-1990 into three subintervals: (a) 1926-1945, (b) 1946-1965, (c) 1966-1990, and calculated the KLL, KLU, SE, PE values for the various time series in these subintervals. It is found that during the period 1946-1965, there is a decrease in their complexities, and corresponding changes in the SE and PE, in comparison to the period 1926-1990. This complexity loss may be primarily attributed to (i) human interventions, after the Second World War, on these two rivers because of their use for water consumption and (ii) climate change in recent times.
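
Of the measures listed above, permutation entropy has the most compact implementation; a minimal Bandt-Pompe sketch (order 3, synthetic inputs in place of river flow records):

```python
import math

def permutation_entropy(x, order=3):
    """Normalized permutation entropy from Bandt-Pompe ordinal patterns."""
    counts = {}
    for i in range(len(x) - order + 1):
        # Rank the order-length window; equal series values keep stable order
        pattern = tuple(sorted(range(order), key=lambda k: x[i + k]))
        counts[pattern] = counts.get(pattern, 0) + 1
    total = sum(counts.values())
    h = -sum(c / total * math.log(c / total) for c in counts.values())
    return h / math.log(math.factorial(order))   # normalized to [0, 1]

# A strictly monotone series uses a single ordinal pattern -> entropy 0;
# a chaotic logistic-map orbit spreads over many patterns -> high entropy.
regular = list(range(100))
chaotic = [0.3]
for _ in range(99):
    chaotic.append(4 * chaotic[-1] * (1 - chaotic[-1]))
print(permutation_entropy(regular), permutation_entropy(chaotic))
```

A complexity loss such as the one the study reports for 1946-1965 would appear here as a drop in this normalized value for the corresponding subinterval.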

  18. Renal transplant lithiasis: analysis of our series and review of the literature.

    Science.gov (United States)

    Stravodimos, Konstantinos G; Adamis, Stefanos; Tyritzis, Stavros; Georgios, Zavos; Constantinides, Constantinos A

    2012-01-01

    Renal transplant lithiasis represents a rather uncommon complication. Even rare, it can result in significant morbidity and a devastating loss of renal function if obstruction occurs. We present our experience with graft lithiasis in our series of renal transplantations and review the literature regarding the epidemiology, pathophysiology, and current therapeutic strategies in the management of renal transplant lithiasis. In a retrospective analysis of a consecutive series of 1525 renal transplantations that were performed between January 1983 and March 2007, 7 patients were found to have allograft lithiasis. In five cases, the calculi were localized in the renal unit, and in two cases, in the ureter. A review in the English language was also performed of the Medline and PubMed databases using the keywords renal transplant lithiasis, donor-gifted lithiasis, and urological complications after kidney transplantation. Several retrospective studies regarding the incidence, etiology, as well as predisposing factors for graft lithiasis were reviewed. Data regarding the current therapeutic strategies for graft lithiasis were also evaluated, and outcomes were compared with the results of our series. Most studies report a renal transplant lithiasis incidence of 0.4% to 1%. In our series, incidence of graft lithiasis was 0.46% (n=7). Of the seven patients, three were treated via percutaneous nephrolithotripsy (PCNL); in three patients, shockwave lithotripsy (SWL) was performed; and in a single case, spontaneous passage of a urinary calculus was observed. All patients are currently stone free but still remain under close urologic surveillance. Renal transplant lithiasis requires vigilance, a high index of suspicion, prompt recognition, and management. Treatment protocols should mimic those for solitary kidneys. Minimally invasive techniques are available to remove graft calculi. Long-term follow-up is essential to determine the outcome, as well as to prevent recurrence.

  19. Spatially adaptive mixture modeling for analysis of FMRI time series.

    Science.gov (United States)

    Vincent, Thomas; Risser, Laurent; Ciuciu, Philippe

    2010-04-01

    Within-subject analysis in fMRI essentially addresses two problems: the detection of brain regions eliciting evoked activity and the estimation of the underlying dynamics. In Makni et al. (2005) and Makni et al. (2008), a detection-estimation framework has been proposed to tackle these problems jointly, since they are connected to one another. In the Bayesian formalism, detection is achieved by modeling activating and nonactivating voxels through independent mixture models (IMM) within each region, while hemodynamic response estimation is performed at a regional scale in a nonparametric way. Instead of IMMs, in this paper we take advantage of spatial mixture models (SMM) for their nonlinear spatial regularizing properties. The proposed method is unsupervised and spatially adaptive in the sense that the amount of spatial correlation is automatically tuned from the data, and this setting automatically varies across brain regions. In addition, the level of regularization is specific to each experimental condition, since both the signal-to-noise ratio and the activation pattern may vary across stimulus types in a given brain region. These aspects require the precise estimation of multiple partition functions of underlying Ising fields. This is addressed efficiently using path sampling for a small subset of fields and then a recently developed fast extrapolation technique for the large remaining set. Simulation results emphasize that detection relying on supervised SMM outperforms its IMM counterpart and that unsupervised spatial mixture models achieve similar results without any hand-tuning of the correlation parameter. On real datasets, the gain is illustrated in a localizer fMRI experiment: brain activations appear more spatially resolved using SMM in comparison with classical general linear model (GLM)-based approaches, while estimating a specific parcel-based HRF shape. Our approach therefore validates the treatment of unsmoothed fMRI data without fixed GLM

  20. Visibility graph analysis of heart rate time series and bio-marker of congestive heart failure

    Science.gov (United States)

    Bhaduri, Anirban; Bhaduri, Susmita; Ghosh, Dipak

    2017-09-01

    The study of RR-interval time series in congestive heart failure has drawn on a variety of methods, including nonlinear ones. In this article the cardiac dynamics of the heart beat are explored in the light of complex network analysis, viz. the visibility graph method. Heart beat (RR interval) time series data taken from the Physionet database [46, 47], belonging to two groups of subjects, diseased (congestive heart failure; 29 in number) and normal (54 in number), are analyzed with the technique. The overall results show that a quantitative parameter can significantly differentiate between the diseased and the normal subjects, as well as between different stages of the disease. Further, when the data are split into periods of around 1 hour each and analyzed separately, the same consistent differences appear. This quantitative parameter obtained from the visibility graph analysis can thereby be used as a potential bio-marker, as well as in a subsequent alarm-generation mechanism, for predicting the onset of congestive heart failure.
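
The natural visibility graph construction underlying this analysis can be written as a brute-force double loop; a minimal sketch on a toy series (not RR-interval data):

```python
def visibility_edges(y):
    """Natural visibility graph: nodes i and j are linked if the straight
    line between (i, y[i]) and (j, y[j]) passes above every sample between."""
    n = len(y)
    edges = set()
    for i in range(n):
        for j in range(i + 1, n):
            if all(y[k] < y[j] + (y[i] - y[j]) * (j - k) / (j - i)
                   for k in range(i + 1, j)):
                edges.add((i, j))
    return edges

# A single spike "sees" every other point, so its node degree is maximal --
# the kind of topological feature that graph-based discriminators exploit.
y = [1.0, 2.0, 10.0, 1.5, 2.5, 1.0]
edges = visibility_edges(y)
degree = [sum(1 for a, b in edges if i in (a, b)) for i in range(len(y))]
print(sorted(edges))
print(degree)
```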

  1. Real analysis modern techniques and their applications

    CERN Document Server

    Folland, Gerald B

    1999-01-01

    An in-depth look at real analysis and its applications-now expanded and revised.This new edition of the widely used analysis book continues to cover real analysis in greater detail and at a more advanced level than most books on the subject. Encompassing several subjects that underlie much of modern analysis, the book focuses on measure and integration theory, point set topology, and the basics of functional analysis. It illustrates the use of the general theories and introduces readers to other branches of analysis such as Fourier analysis, distribution theory, and probability theory.This edi

  2. Melodic pattern extraction in large collections of music recordings using time series mining techniques

    OpenAIRE

    Gulati, Sankalp; Serrà, Joan; Ishwar, Vignesh; Serra, Xavier

    2014-01-01

    We demonstrate a data-driven unsupervised approach for the discovery of melodic patterns in large collections of Indian art music recordings. The approach first works on single recordings and subsequently searches in the entire music collection. Melodic similarity is based on dynamic time warping. The task being computationally intensive, lower bounding and early abandoning techniques are applied during distance computation. Our dataset comprises 365 hours of music, containing 1,764 audio rec...
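
The lower-bounding and early-abandoning idea mentioned in this abstract is commonly realized with the LB_Keogh bound; the sketch below (a generic illustration, not the authors' implementation) computes the cheap bound next to a plain DTW for comparison:

```python
def lb_keogh(query, candidate, r):
    """LB_Keogh lower bound (squared cost) on band-constrained DTW, window r.
    Cheap to compute: if it already exceeds the best distance found so far,
    the expensive DTW for this candidate can be abandoned early."""
    total = 0.0
    for i, q in enumerate(query):
        window = candidate[max(0, i - r):i + r + 1]
        lo, hi = min(window), max(window)
        if q > hi:
            total += (q - hi) ** 2
        elif q < lo:
            total += (q - lo) ** 2
    return total

def dtw(a, b):
    """Plain O(len(a) * len(b)) dynamic time warping with squared cost."""
    inf = float("inf")
    d = [[inf] * (len(b) + 1) for _ in range(len(a) + 1)]
    d[0][0] = 0.0
    for i in range(1, len(a) + 1):
        for j in range(1, len(b) + 1):
            cost = (a[i - 1] - b[j - 1]) ** 2
            d[i][j] = cost + min(d[i - 1][j], d[i][j - 1], d[i - 1][j - 1])
    return d[-1][-1]

q = [0.0, 1.0, 2.0, 1.0, 0.0]   # toy melodic contour
c = [0.0, 0.0, 1.0, 2.0, 1.0]   # time-shifted candidate
print(lb_keogh(q, c, 1), dtw(q, c))
```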

  3. SPI Trend Analysis of New Zealand Applying the ITA Technique

    Directory of Open Access Journals (Sweden)

    Tommaso Caloiero

    2018-03-01

    Full Text Available A natural temporary imbalance of water availability, consisting of persistent lower-than-average or higher-than-average precipitation, can cause extreme dry and wet conditions that adversely impact agricultural yields, water resources, infrastructure, and human systems. In this study, dry and wet periods in New Zealand were expressed using the Standardized Precipitation Index (SPI). First, both the short-term (3 and 6 months) and the long-term (12 and 24 months) SPI were estimated, and then possible trends in the SPI values were detected by means of a new graphical technique, the Innovative Trend Analysis (ITA), which allows the trend identification of the low, medium, and high values of a series. Results show that, in every area currently subject to drought, an increase in this phenomenon can be expected. Specifically, the results of this paper highlight that agricultural regions on the eastern side of the South Island, as well as the north-eastern regions of the North Island, are the most consistently vulnerable areas. In fact, in these regions the trend analysis mainly showed a general reduction in all the values of the SPI: that is, a tendency toward heavier droughts and weaker wet periods.
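
The ITA procedure reduces to sorting the two halves of the record and comparing them pairwise against the 1:1 line; a minimal sketch on an invented "drying" series (the data and the mean-difference indicator are illustrative assumptions, not from the paper):

```python
def ita(series):
    """Innovative Trend Analysis (Sen's graphical method): sort the two
    halves of the series and compare them pairwise against the 1:1 line."""
    n = len(series) // 2
    first = sorted(series[:n])
    second = sorted(series[n:2 * n])
    above = sum(b > a for a, b in zip(first, second))   # values that increased
    below = sum(b < a for a, b in zip(first, second))   # values that decreased
    mean_diff = sum(b - a for a, b in zip(first, second)) / n  # crude trend indicator
    return above, below, mean_diff

# Toy "SPI-like" record with a superimposed drying trend (illustrative only)
base = [0.5, -0.2, 1.1, -0.9, 0.3, -1.4] * 5
drying = [v - 0.05 * t for t, v in enumerate(base)]
above, below, mean_diff = ita(drying)
print(above, below, mean_diff)   # most sorted pairs fall below the 1:1 line
```

Because the halves are sorted before comparison, the method can show separately whether the low, medium, or high values of the series are trending, which is the feature the abstract highlights.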

  4. Use of a Barbed Suture Tie-Over Technique for Skin Graft Dressings: A Case Series

    Directory of Open Access Journals (Sweden)

    Kenneth M Joyce

    2015-05-01

    Full Text Available BackgroundA tie-over dressing is the accepted method to secure skin grafts in order to prevent haematoma or seroma formation. We describe the novel application of a barbed suture tie-over for skin graft dressing. The barbs act as anchors in the skin so constant tensioning of the suture is not required.MethodsFrom January 2014 to August 2014 we used the technique in 30 patients with skin defects requiring split-thickness or full-thickness grafts. Patient demographics, clinicopathological details and graft outcome were collected prospectively.ResultsThe majority of cases were carried out for split-thickness skin grafts (n=19 used on the lower limb (n=20. The results of this novel technique were excellent with complete (100% graft take in all patients.ConclusionsOur results demonstrate the clinical application of a barbed device for securing skin grafts with excellent results. We find the technique quick to perform and the barbed device easy to handle, which can be applied without the need for an assistant.

  5. Techniques involving extreme environment, nondestructive techniques, computer methods in metals research, and data analysis

    International Nuclear Information System (INIS)

    Bunshah, R.F.

    1976-01-01

    A number of different techniques spanning several aspects of materials research are covered in this volume. They are concerned with property evaluation at 4 K and below, surface characterization, coating techniques, techniques for the fabrication of composite materials, computer methods, data evaluation and analysis, statistical design of experiments, and non-destructive test techniques. Topics covered in this part include internal friction measurements; nondestructive testing techniques; statistical design of experiments and regression analysis in metallurgical research; and measurement of surfaces of engineering materials.

  6. Modeling Philippine Stock Exchange Composite Index Using Time Series Analysis

    Science.gov (United States)

    Gayo, W. S.; Urrutia, J. D.; Temple, J. M. F.; Sandoval, J. R. D.; Sanglay, J. E. A.

    2015-06-01

    This study was conducted to develop a time series model of the Philippine Stock Exchange Composite Index (PSEi) and its volatility using a finite mixture of ARIMA models with conditional variance equations such as the ARCH, GARCH, EGARCH, TARCH and PARCH models. The study also aimed to find out the reason behind the behavior of PSEi, that is, which of the economic variables - Consumer Price Index, crude oil price, foreign exchange rate, gold price, interest rate, money supply, price-earnings ratio, Producers' Price Index and terms of trade - can be used in projecting future values of PSEi; this was examined using the Granger Causality Test. The findings showed that the best time series model for the Philippine Stock Exchange Composite Index is ARIMA(1,1,5)-ARCH(1). Also, the Consumer Price Index, crude oil price and foreign exchange rate were concluded to Granger-cause the Philippine Stock Exchange Composite Index.

  7. Time series analysis of the behavior of brazilian natural rubber

    Directory of Open Access Journals (Sweden)

    Antônio Donizette de Oliveira

    2009-03-01

    Full Text Available Natural rubber is a non-wood product obtained by coagulating the latex of certain forest species, chief among them Hevea brasiliensis. Native to the Amazon Region, this species was already known to the Indians before the discovery of America. Natural rubber became a globally valued product due to its multiple applications in the economy, its almost perfect substitute being synthetic rubber derived from petroleum. As with countless other products, forecasting the future prices of natural rubber has been the object of many studies. The use of univariate time series forecasting models stands out as the most accurate and useful way to reduce uncertainty in the economic decision-making process. This study analyzed the historical series of Brazilian natural rubber prices (R$/kg) over the period Jan/1999-Jun/2006 in order to characterize price behavior in the domestic market; estimated a model for the time series of monthly natural rubber prices; and forecast domestic natural rubber prices for the period Jul/2006-Jun/2007 based on the estimated models. The models studied belong to the ARIMA family. The main results were: the domestic natural rubber market is expanding due to the growth of the world economy; among the adjusted models, the ARIMA(1,1,1) model provided the best fit to the time series of natural rubber prices (R$/kg); and the forecasts produced for the series were statistically adequate.

  8. Financial time series analysis based on information categorization method

    Science.gov (United States)

    Tian, Qiang; Shang, Pengjian; Feng, Guochen

    2014-12-01

    This paper applies the information categorization method to the analysis of financial time series. The method examines the similarity of different sequences by calculating the distances between them. We apply it to quantify the similarity of different stock markets, reporting results for the US and Chinese stock markets in the periods 1991-1998 (before the Asian currency crisis), 1999-2006 (after the Asian currency crisis and before the global financial crisis), and 2007-2013 (during and after the global financial crisis). The results show how the similarity between the two markets differs across time periods, and that their similarity became larger after these two crises. We also obtain similarity results for 10 stock indices in three areas, showing that the method can distinguish the markets of different areas from the phylogenetic trees. These results show that satisfactory information can be extracted from financial markets by this method, which can be applied not only to physiologic time series but also to financial time series.
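
The paper's information-based distance is not reproduced here, but the general recipe (symbolize each series, then measure a distance between symbol distributions) can be sketched as follows; the equal-width binning and the Euclidean distance are simplifying stand-ins for the actual measure:

```python
import math

def symbol_distribution(series, n_bins=4):
    """Coarse-grain the one-step changes of a series into n_bins equal-width
    symbols and return the relative frequency of each symbol."""
    returns = [b - a for a, b in zip(series, series[1:])]
    lo, hi = min(returns), max(returns)
    width = (hi - lo) / n_bins or 1.0          # guard against a constant series
    counts = [0] * n_bins
    for r in returns:
        counts[min(int((r - lo) / width), n_bins - 1)] += 1
    total = sum(counts)
    return [c / total for c in counts]

def distribution_distance(p, q):
    """Euclidean distance between two symbol distributions."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

a = [(-1) ** t for t in range(200)]            # oscillating "market"
b = [1.1 * (-1) ** t for t in range(200)]      # same dynamics, different scale
c = [0.1 * t for t in range(200)]              # steadily trending "market"

d_close = distribution_distance(symbol_distribution(a), symbol_distribution(b))
d_far = distribution_distance(symbol_distribution(a), symbol_distribution(c))
print(d_close, d_far)   # similar dynamics give the smaller distance
```

A matrix of such pairwise distances is what a hierarchical clustering (the phylogenetic tree mentioned above) would then be built from.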

  9. Application of functional analysis techniques to supervisory systems

    International Nuclear Information System (INIS)

    Lambert, Manuel; Riera, Bernard; Martel, Gregory

    1999-01-01

    The aim of this paper is, first, to apply two functional analysis techniques to the design of supervisory systems for complex processes and, second, to discuss the strengths and weaknesses of each. Two functional analysis techniques, SADT (Structured Analysis and Design Technique) and FAST (Functional Analysis System Technique), have been applied to an example process, a Water Supply Process Control (WSPC) system. These techniques allow a functional description of industrial processes. The paper briefly discusses the functions of a supervisory system and some advantages of applying functional analysis to the design of a 'human'-centered supervisory system. The basic principles of the two techniques, as applied to the WSPC system, are then presented. Finally, the different results obtained with the two techniques are discussed

  10. Characterization of Ground Displacement Sources from Variational Bayesian Independent Component Analysis of Space Geodetic Time Series

    Science.gov (United States)

    Gualandi, Adriano; Serpelloni, Enrico; Elina Belardinelli, Maria; Bonafede, Maurizio; Pezzo, Giuseppe; Tolomei, Cristiano

    2015-04-01

    A critical point in the analysis of ground displacement time series, such as those measured by modern space geodetic techniques (primarily continuous GPS/GNSS and InSAR), is the development of data-driven methods that make it possible to discern and characterize the different sources that generate the observed displacements. A widely used multivariate statistical technique is Principal Component Analysis (PCA), which reduces the dimensionality of the data space while retaining most of the explained variance of the dataset. It reproduces the original data using a limited number of Principal Components, but it also shows some deficiencies: PCA does not perform well in solving the so-called Blind Source Separation (BSS) problem. Recovering and separating the different sources that generate the observed ground deformation is a fundamental task in order to give the possible sources a physical meaning. PCA fails in the BSS problem because it looks for a new Euclidean space where the projected data are uncorrelated. The uncorrelatedness condition is usually not strong enough, and it has been proven that the BSS problem can be tackled by requiring the components to be independent. Independent Component Analysis (ICA) is, in fact, another popular technique adopted to approach this problem, and it can be used in all fields where PCA is also applied. An ICA approach enables us to explain the displacement time series with fewer constraints on the model, and to reveal anomalies in the data such as transient deformation signals. However, the independence condition is not easy to impose, and it is often necessary to introduce approximations. To work around this problem, we use a variational Bayesian ICA (vbICA) method, which models the probability density function (pdf) of each source signal using a mixture of Gaussian distributions. This technique allows for more flexibility in the description of the pdf of the sources
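
    A minimal numpy sketch of the PCA step discussed above (vbICA itself is substantially more involved): synthetic displacement series from 20 hypothetical stations share one common source, and the SVD-based decomposition recovers how much variance a single component explains.

```python
import numpy as np

rng = np.random.default_rng(2)
# synthetic displacement time series: 20 "stations" sharing one common-mode signal
t = np.linspace(0, 4, 200)
source = np.sin(2 * np.pi * t)                       # common deformation source
mixing = rng.standard_normal(20)                     # per-station response amplitudes
X = np.outer(source, mixing) + 0.1 * rng.standard_normal((200, 20))

Xc = X - X.mean(axis=0)                              # center each station's series
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)    # PCA via SVD
explained = s**2 / np.sum(s**2)                      # variance explained per component
print(explained[:3])                                  # the first PC dominates here
```

    ICA would go one step further, rotating such components until they are statistically independent rather than merely uncorrelated.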

  11. IMAGE ANALYSIS BASED ON EDGE DETECTION TECHNIQUES

    Institute of Scientific and Technical Information of China (English)

    纳瑟; 刘重庆

    2002-01-01

    A method incorporating edge detection, Markov Random Field (MRF) modeling, watershed segmentation, and merging techniques is presented for performing image segmentation and edge detection tasks. It first applies an edge detection technique to obtain a Difference In Strength (DIS) map. An initial segmentation is obtained with the K-means clustering technique and the minimum-distance rule. The region process is then modeled by an MRF to obtain an image containing regions of different intensity. Gradient values are calculated and the watershed technique is applied. The DIS value is computed for each pixel to identify all edges (weak or strong) in the image, yielding the DIS map. This map serves as prior knowledge about likely region boundaries for the next step (MRF), which produces an image containing all the edge and region information. In the MRF model, the gray level l at pixel location i in an image X depends on the gray levels of neighboring pixels. The segmentation results are improved by the watershed algorithm. After all pixels of the segmented regions are processed, a map of primitive regions with edges is generated, and the final edge map is obtained by a merging process based on averaged intensity mean values. Common edge detectors operating on the MRF-segmented image are used and their results compared. The segmentation and edge detection result is one closed boundary per actual region in the image.
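
    A toy version of the first step above, assuming a simple gradient-magnitude definition for the DIS-style edge map (the paper's exact DIS formula is not given here):

```python
import numpy as np

def dis_map(img):
    """Difference-in-strength style edge map: gradient magnitude via finite differences."""
    gy, gx = np.gradient(img.astype(float))
    return np.hypot(gx, gy)

# synthetic image: two intensity regions separated by a vertical edge
img = np.zeros((32, 32))
img[:, 16:] = 1.0
dis = dis_map(img)
edge_cols = np.where(dis.max(axis=0) > 0.25)[0]
print(edge_cols)   # the two columns adjacent to the boundary
```

    In the full pipeline, such a map would feed the MRF region model and the subsequent watershed and merging stages.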

  12. Fourier series analysis of a cylindrical pressure vessel subjected to axial end load and external pressure

    International Nuclear Information System (INIS)

    Brar, Gurinder Singh; Hari, Yogeshwar; Williams, Dennis K.

    2013-01-01

    This paper presents the comparison of a reliability technique that employs a Fourier series representation of random axisymmetric and asymmetric imperfections in a cylindrical pressure vessel subjected to an axial end load and external pressure with the evaluations prescribed by the ASME Boiler and Pressure Vessel Code, Section VIII, Division 2 Rules. The ultimate goal of the reliability technique described herein is to predict the critical buckling load of the subject cylindrical pressure vessel. Initial geometric imperfections are shown to have a significant effect on the calculated load-carrying capacity of the vessel. Fourier decomposition is employed to interpret imperfections as structural features that can be easily related to various other types of defined imperfections. The initial functional description of the imperfections consists of an axisymmetric portion and a deviant portion, which are expressed in the form of a double Fourier series. Fifty simulated shells generated by the Monte Carlo technique are employed in the final prediction of the critical buckling load. The representation of initial geometric imperfections in the cylindrical pressure vessel requires the determination of the respective Fourier coefficients. Multi-mode analyses are expanded to evaluate a large number of potential buckling modes for both predefined geometries in combination with asymmetric imperfections as a function of position within the given cylindrical shell. The probability of the ultimate buckling stress exceeding a predefined threshold stress is also calculated. The method and results described herein are in stark contrast to the “knockdown factor” approach currently applied to compressive stress evaluations in industry. Further effort is needed to improve on the current design rules regarding column buckling of large-diameter pressure vessels subjected to an axial end load and external pressure designed in accordance with ASME Boiler and
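
    The Monte Carlo generation of imperfect shells can be sketched as follows: random Fourier coefficients (invented magnitudes, for illustration only) define the asymmetric radial imperfection profile w(theta) of each of 50 simulated shells.

```python
import numpy as np

rng = np.random.default_rng(3)
n_shells, n_modes, n_theta = 50, 8, 360
theta = np.linspace(0, 2 * np.pi, n_theta, endpoint=False)

# random Fourier coefficients for the asymmetric imperfections of 50 simulated shells
a = 0.001 * rng.standard_normal((n_shells, n_modes))   # cosine terms
b = 0.001 * rng.standard_normal((n_shells, n_modes))   # sine terms
modes = np.arange(1, n_modes + 1)

# radial imperfection profile w(theta) per shell (the axisymmetric part is omitted here)
w = a @ np.cos(np.outer(modes, theta)) + b @ np.sin(np.outer(modes, theta))
print(w.shape)   # (50, 360): one circumferential imperfection profile per simulated shell
```

    Each simulated profile would then enter a buckling analysis, and the fifty resulting loads form the empirical distribution from which the exceedance probability is read off.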

  13. Data Analysis Techniques for Physical Scientists

    Science.gov (United States)

    Pruneau, Claude A.

    2017-10-01

    Preface; How to read this book; 1. The scientific method; Part I. Foundation in Probability and Statistics: 2. Probability; 3. Probability models; 4. Classical inference I: estimators; 5. Classical inference II: optimization; 6. Classical inference III: confidence intervals and statistical tests; 7. Bayesian inference; Part II. Measurement Techniques: 8. Basic measurements; 9. Event reconstruction; 10. Correlation functions; 11. The multiple facets of correlation functions; 12. Data correction methods; Part III. Simulation Techniques: 13. Monte Carlo methods; 14. Collision and detector modeling; List of references; Index.

  14. Event-sequence time series analysis in ground-based gamma-ray astronomy

    International Nuclear Information System (INIS)

    Barres de Almeida, U.; Chadwick, P.; Daniel, M.; Nolan, S.; McComb, L.

    2008-01-01

    The recent extreme episodes of variability detected from blazars by the leading atmospheric Cerenkov experiments motivate the development and application of specialized statistical techniques that enable this rich data set to be studied to its fullest extent. The identification of the shortest variability timescales supported by the data and of the actual variability structure observed in the light curves of these sources are among the fundamental questions under study, whose answers can bring new developments in the understanding of the physics of these objects and of the mechanisms of VHE gamma-ray production in the Universe. Some of our efforts in studying the time variability of VHE sources involve the application of dynamic programming algorithms to the problem of detecting change-points in a Poisson sequence. In this particular paper we concentrate on the more basic issue of the applicability of counting statistics to time-series analysis in VHE gamma-ray astronomy.
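
    A minimal sketch of change-point detection in a Poisson sequence, using an exhaustive single-change-point likelihood scan rather than the full dynamic-programming treatment referred to above:

```python
import numpy as np

def poisson_changepoint(counts):
    """Single change-point in a Poisson sequence: maximize the split log-likelihood."""
    c = np.asarray(counts, dtype=float)
    n = len(c)
    best_k, best_ll = None, -np.inf
    for k in range(1, n):
        l1, l2 = c[:k].mean(), c[k:].mean()
        # Poisson log-likelihood up to terms independent of the split point
        ll = 0.0
        if l1 > 0:
            ll += c[:k].sum() * np.log(l1) - k * l1
        if l2 > 0:
            ll += c[k:].sum() * np.log(l2) - (n - k) * l2
        if ll > best_ll:
            best_k, best_ll = k, ll
    return best_k

rng = np.random.default_rng(4)
# synthetic photon counts: a flaring source whose rate jumps from 2 to 8 counts per bin
counts = np.concatenate([rng.poisson(2.0, 100), rng.poisson(8.0, 100)])
print(poisson_changepoint(counts))   # near the true change at index 100
```

    Dynamic programming generalizes this scan to multiple change-points without the combinatorial blow-up of testing every partition.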

  15. Surface analysis and techniques in biology

    CERN Document Server

    Smentkowski, Vincent S

    2014-01-01

    This book highlights state-of-the-art surface analytical instrumentation, advanced data analysis tools, and the use of complementary surface analytical instrumentation to perform a complete analysis of biological systems.

  16. Time-Series Analysis of Supergranule Characteristics at Solar Minimum

    Science.gov (United States)

    Williams, Peter E.; Pesnell, W. Dean

    2013-01-01

    Sixty days of Doppler images from the Solar and Heliospheric Observatory (SOHO) / Michelson Doppler Imager (MDI) investigation during the 1996 and 2008 solar minima have been analyzed to show that certain supergranule characteristics (size, size range, and horizontal velocity) exhibit fluctuations of three to five days. Cross-correlating parameters showed a good, positive correlation between supergranulation size and size range, and a moderate, negative correlation between size range and velocity. The size and velocity do exhibit a moderate, negative correlation, but with a small time lag (less than 12 hours). Supergranule sizes during five days of co-temporal data from MDI and the Solar Dynamics Observatory (SDO) / Helioseismic Magnetic Imager (HMI) exhibit similar fluctuations with a high level of correlation between them. This verifies the solar origin of the fluctuations, which cannot be caused by instrumental artifacts according to these observations. Similar fluctuations are also observed in data simulations that model the evolution of the MDI Doppler pattern over a 60-day period. Correlations between the supergranule size and size range time-series derived from the simulated data are similar to those seen in MDI data. A simple toy-model using cumulative, uncorrelated exponential growth and decay patterns at random emergence times produces a time-series similar to the data simulations. The qualitative similarities between the simulated and the observed time-series suggest that the fluctuations arise from stochastic processes occurring within the solar convection zone. This behavior, propagating to surface manifestations of supergranulation, may assist our understanding of magnetic-field-line advection, evolution, and interaction.
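
    The lagged cross-correlation analysis described above can be sketched as follows; the series are synthetic stand-ins with an assumed anti-correlation at a known lag, not MDI data.

```python
import numpy as np

def lagged_corr(x, y, max_lag):
    """Pearson correlation of y against x at each lag (positive lag: y trails x)."""
    out = {}
    for lag in range(-max_lag, max_lag + 1):
        if lag >= 0:
            a, b = x[:len(x) - lag], y[lag:]
        else:
            a, b = x[-lag:], y[:lag]
        out[lag] = np.corrcoef(a, b)[0, 1]
    return out

rng = np.random.default_rng(5)
x = rng.standard_normal(600)       # stand-in for a supergranule "size" series
# "velocity" series: anti-correlated with size at an assumed 6-step lag, plus noise
y = -np.concatenate([np.zeros(6), x[:-6]]) + 0.1 * rng.standard_normal(600)
cc = lagged_corr(x, y, 12)
best = min(cc, key=cc.get)         # lag of the most negative correlation
print(best)
```

    Scanning the correlation over lags, as here, is how a "moderate, negative correlation with a small time lag" is identified in practice.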

  17. The "Chaos Theory" and nonlinear dynamics in heart rate variability analysis: does it work in short-time series in patients with coronary heart disease?

    Science.gov (United States)

    Krstacic, Goran; Krstacic, Antonija; Smalcelj, Anton; Milicic, Davor; Jembrek-Gostovic, Mirjana

    2007-04-01

    Dynamic analysis techniques may quantify abnormalities in heart rate variability (HRV) based on nonlinear and fractal analysis (chaos theory). This article emphasizes the clinical and prognostic significance of dynamic changes in short-time series from patients with coronary heart disease (CHD) during the exercise electrocardiograph (ECG) test. Subjects were included in the series after complete cardiovascular diagnostic workup. Series of R-R and ST-T intervals were obtained from digitally sampled exercise ECG data. The rescaled range analysis method determined the fractal dimension of the intervals. To quantify the fractal long-range correlation properties of heart rate variability, the detrended fluctuation analysis technique was used. Approximate entropy (ApEn) was applied to quantify the regularity and complexity of the time series, as well as the unpredictability of its fluctuations. The short-term fractal scaling exponent (alpha(1)) was found to be significantly lower in patients with CHD (0.93 +/- 0.07 vs 1.09 +/- 0.04). The results of chaos-theory analysis during the exercise ECG test point to multifractal time series in CHD patients, who lose normal fractal characteristics and regularity in HRV. Nonlinear analysis techniques may complement traditional ECG analysis.
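
    A compact sketch of detrended fluctuation analysis (DFA), the technique used above to estimate the short-term scaling exponent alpha: for uncorrelated noise the exponent comes out near 0.5, whereas the abstract reports values closer to 1 for healthy HRV.

```python
import numpy as np

def dfa_alpha(x, scales):
    """Detrended fluctuation analysis: slope of log F(n) versus log n."""
    y = np.cumsum(x - np.mean(x))              # integrated profile
    F = []
    for n in scales:
        n_seg = len(y) // n
        f2 = []
        for i in range(n_seg):
            seg = y[i * n:(i + 1) * n]
            t = np.arange(n)
            coef = np.polyfit(t, seg, 1)       # local linear detrending
            f2.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        F.append(np.sqrt(np.mean(f2)))
    slope, _ = np.polyfit(np.log(scales), np.log(F), 1)
    return slope

rng = np.random.default_rng(6)
white = rng.standard_normal(4000)              # stand-in for an interval series
alpha = dfa_alpha(white, [8, 16, 32, 64, 128])
print(round(alpha, 2))                          # ~0.5 for uncorrelated noise
```

    In the clinical setting, the same computation restricted to small scales yields the short-term exponent alpha(1) compared between CHD patients and controls.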

  18. Applicability of contact angle techniques used in the analysis of contact lenses, part 1: comparative methodologies.

    Science.gov (United States)

    Campbell, Darren; Carnell, Sarah Maria; Eden, Russell John

    2013-05-01

    Contact angle, as a representative measure of surface wettability, is often employed to interpret contact lens surface properties. The literature is often contradictory and can lead to confusion. This literature review is part of a series regarding the analysis of hydrogel contact lenses using contact angle techniques. Here we present an overview of contact angle terminology, methodology, and analysis. Having discussed this background material, subsequent parts of the series will discuss the analysis of contact lens contact angles and evaluate differences in published laboratory results. The concepts of contact angle, wettability and wetting are presented as an introduction. Contact angle hysteresis is outlined, highlighting the advantages of dynamic analytical techniques over static methods. The surface free energy of a material illustrates how contact angle analysis is capable of providing supplementary surface characterization. Although single values are able to distinguish individual material differences, surface free energy and dynamic methods provide an improved understanding of material behavior. The frequently used sessile drop, captive bubble, and Wilhelmy plate techniques are discussed. Their use as both dynamic and static methods, along with the advantages and disadvantages of each technique, is explained. No single contact angle technique fully characterizes the wettability of a material surface, and the application of complementary methods allows increased characterization. At present, there is no ISO standard method designed for soft materials. It is important that each contact angle technique has a standard protocol, as small protocol differences between laboratories often contribute to a variety of published data that are not easily comparable.

  19. Comparison between scalpel technique and electrosurgery for depigmentation: A case series

    Directory of Open Access Journals (Sweden)

    B M Bhusari

    2011-01-01

    Full Text Available A beautiful smile definitely enhances the personality of an individual and reveals self-confidence. The harmony of the smile is determined not only by the shape, position, and color of the teeth but also by the gingival tissues. Gingival pigmentation results from melanin granules which are produced by melanoblasts. Although melanin pigmentation of the gingiva is a completely benign condition and does not pose any medical problem, complaints of "black gums" are common particularly in patients having a very high smile line. The different treatment modalities that have been reported for depigmentation are bur abrasion, partial thickness flap, cryotherapy, electrosurgery, and lasers. In this paper we have compared the results of electrosurgery and scalpel technique, i.e., partial thickness flap.

  20. Technique optimization of orbital atherectomy in calcified peripheral lesions of the lower extremities: the CONFIRM series, a prospective multicenter registry.

    Science.gov (United States)

    Das, Tony; Mustapha, Jihad; Indes, Jeffrey; Vorhies, Robert; Beasley, Robert; Doshi, Nilesh; Adams, George L

    2014-01-01

    The purpose of the CONFIRM registry series was to evaluate the use of orbital atherectomy (OA) in peripheral lesions of the lower extremities and to optimize the OA technique. Methods of treating calcified arteries (historically a strong predictor of treatment failure) have improved significantly over the past decade and now include minimally invasive endovascular treatments such as OA, with unique versatility in modifying calcific lesions above and below the knee. Patients (3135) undergoing OA by more than 350 physicians at over 200 US institutions were enrolled on an "all-comers" basis, resulting in registries that provided site-reported patient demographics, ABI, Rutherford classification, co-morbidities, lesion characteristics, plaque morphology, device usage parameters, and procedural outcomes. Treatment with OA reduced pre-procedural stenosis from an average of 88% to 35%. Final residual stenosis after adjunctive treatments, typically low-pressure percutaneous transluminal angioplasty (PTA), averaged 10%. Plaque removal was most effective for severely calcified lesions and least effective for soft plaque. Shorter spin times and smaller crown sizes significantly lowered procedural complications, which included slow flow (4.4%), embolism (2.2%), and spasm (6.3%), emphasizing the importance of treatment regimens that focus on plaque modification over maximizing luminal gain. The OA technique optimization, which resulted in a change of device usage across the CONFIRM registry series, corresponded to a lower incidence of adverse events irrespective of calcium burden or co-morbidities. Copyright © 2013 The Authors. Wiley Periodicals, Inc.

  1. Visibility graph analysis on quarterly macroeconomic series of China based on complex network theory

    Science.gov (United States)

    Wang, Na; Li, Dong; Wang, Qiwen

    2012-12-01

    The visibility graph approach and complex network theory provide new insight into time series analysis. The inheritance by the visibility graph of properties of the original time series is further explored in this paper. We find that the degree distributions of visibility graphs extracted from pseudo Brownian motion series obtained by the frequency-domain algorithm exhibit exponential behavior, in which the exponential exponent is a binomial function of the Hurst index inherited from the time series. Our simulations show that the quantitative relations between the Hurst indexes and the exponents of the degree distribution function differ between series, and that the visibility graph inherits some important features of the original time series. Further, we convert several quarterly macroeconomic series, including the growth rates of value-added of three industry series and the growth rates of the Gross Domestic Product series of China, to graphs by the visibility algorithm, and explore the topological properties of the graphs associated with the four macroeconomic series, namely the degree distribution and correlations, the clustering coefficient, the average path length, and community structure. Based on complex network analysis we find that the degree distributions of the networks associated with the growth rates of value-added of the three industry series are almost exponential, while the degree distributions of the networks associated with the growth rates of the GDP series are scale free. We also discuss the assortativity and disassortativity of the four associated networks as they relate to the evolutionary process of the original macroeconomic series. All the constructed networks have “small-world” features. The community structures of the associated networks suggest dynamic changes in the original macroeconomic series. We also detected relationships among government policy changes, community structures of associated networks and macroeconomic dynamics. We find great influences of government
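
    The natural visibility algorithm referred to above can be stated in a few lines: two samples are connected when the straight line between them passes above every intermediate sample.

```python
def visibility_graph(series):
    """Natural visibility graph: nodes are samples; (a, b) are connected if every
    intermediate point lies strictly below the line joining them."""
    n = len(series)
    edges = set()
    for a in range(n):
        for b in range(a + 1, n):
            visible = all(
                series[c] < series[a] + (series[b] - series[a]) * (c - a) / (b - a)
                for c in range(a + 1, b)
            )
            if visible:
                edges.add((a, b))
    return edges

series = [3.0, 1.0, 2.0, 4.0, 1.5]           # a tiny toy series, not macroeconomic data
edges = visibility_graph(series)
degree = [sum(1 for e in edges if i in e) for i in range(len(series))]
print(sorted(edges))                          # seven edges for this five-point series
```

    The degree sequence of the resulting graph is what is binned into the degree distributions (exponential versus scale-free) discussed in the abstract.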

  2. Anatomy of the ICDS series: A bibliometric analysis

    International Nuclear Information System (INIS)

    Cardona, Manuel; Marx, Werner

    2007-01-01

    In this article, the proceedings of the International Conferences on Defects in Semiconductors (ICDS) are analyzed by bibliometric methods. The papers of these conferences have been published as articles in regular journals or special proceedings journals and in books with diverse publishers, and the conference name/title changed several times. Many of the proceedings did not appear in the so-called 'source journals' covered by the Thomson/ISI citation databases, in particular by the Science Citation Index (SCI). However, the number of citations within these source journals can be determined using the Cited Reference Search mode of the Web of Science (WoS) and the SCI offered by the host STN International. The search functions of both systems were needed to select the papers published as different document types and to cover the full time span of the series. The most cited ICDS papers were identified, and the overall numbers of citations as well as the time-dependent impact of these papers, of single conferences, and of the complete series, were established. The complete set of citing papers was analyzed with respect to the countries of the citing authors, the citing journals, and the ISI subject categories

  3. Discontinuous conduction mode analysis of phase-modulated series ...

    Indian Academy of Sciences (India)

    Utsab Kundu

    domain analysis; frequency domain analysis; critical load resistance.

  4. Survey of immunoassay techniques for biological analysis

    International Nuclear Information System (INIS)

    Burtis, C.A.

    1986-10-01

    Immunoassay is a very specific, sensitive, and widely applicable analytical technique. Recent advances in genetic engineering have led to the development of monoclonal antibodies, which further improve the specificity of immunoassays. Originally, radioisotopes were used to label the antigens and antibodies used in immunoassays. In the last decade, however, numerous types of immunoassays have been developed that utilize enzymes and fluorescent dyes as labels. Given the technical, safety, health, and disposal problems associated with using radioisotopes, immunoassays that utilize enzyme and fluorescent labels are rapidly replacing those using radioisotope labels. These newer techniques are as sensitive, are easily automated, have stable reagents, and pose no disposal problem. 6 refs., 1 fig., 2 tabs

  5. Trend Estimation and Regression Analysis in Climatological Time Series: An Application of Structural Time Series Models and the Kalman Filter.

    Science.gov (United States)

    Visser, H.; Molenaar, J.

    1995-05-01

    The detection of trends in climatological data has become central to the discussion on climate change due to the enhanced greenhouse effect. To prove detection, a method is needed (i) to make inferences on significant rises or declines in trends, (ii) to take into account natural variability in climate series, and (iii) to compare output from GCMs with the trends in observed climate data. To meet these requirements, flexible mathematical tools are needed. A structural time series model is proposed with which a stochastic trend, a deterministic trend, and regression coefficients can be estimated simultaneously. The stochastic trend component is described using the class of ARIMA models. The regression component is assumed to be linear; however, the regression coefficients corresponding to the explanatory variables may be allowed to be time dependent in order to validate this assumption. The mathematical technique used to estimate this trend-regression model is the Kalman filter. The main features of the filter are discussed. Examples of trend estimation are given using annual mean temperatures at a single station in the Netherlands (1706-1990) and annual mean temperatures at Northern Hemisphere land stations (1851-1990). The inclusion of explanatory variables is shown by regressing the latter temperature series on four variables: Southern Oscillation index (SOI), volcanic dust index (VDI), sunspot numbers (SSN), and a simulated temperature signal induced by increasing greenhouse gases (GHG). In all analyses, the influence of SSN on global temperatures is found to be negligible. The correlations between temperatures and SOI and VDI appear to be negative. For SOI, this correlation is significant, but for VDI it is not, probably because of a lack of volcanic eruptions during the sample period. The relation between temperatures and GHG is positive, which is in agreement with the hypothesis of a warming climate because of increasing levels of greenhouse gases.
The prediction performance of
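
    A minimal sketch of the filtering machinery described above: a Kalman filter for a local linear trend model (a simple structural time-series model) applied to a synthetic warming-like series. The noise variances are illustrative assumptions, not values from the paper.

```python
import numpy as np

def kalman_trend(y, q_level=0.01, q_slope=0.001, r=1.0):
    """Kalman filter for a local linear trend model:
       level_t = level_{t-1} + slope_{t-1} + noise;  slope_t = slope_{t-1} + noise."""
    T = np.array([[1.0, 1.0], [0.0, 1.0]])     # state transition
    Z = np.array([1.0, 0.0])                   # observation vector (observe the level)
    Q = np.diag([q_level, q_slope])            # state noise covariance
    x = np.array([y[0], 0.0])                  # initial state: first obs, zero slope
    P = np.eye(2) * 10.0                       # vague initial state covariance
    levels = []
    for obs in y:
        x = T @ x                              # predict
        P = T @ P @ T.T + Q
        v = obs - Z @ x                        # innovation
        S = Z @ P @ Z + r                      # innovation variance
        K = P @ Z / S                          # Kalman gain
        x = x + K * v                          # update
        P = P - np.outer(K, Z @ P)
        levels.append(x[0])
    return np.array(levels)

rng = np.random.default_rng(7)
t = np.arange(200)
y = 0.05 * t + rng.standard_normal(200)        # synthetic trend plus natural variability
level = kalman_trend(y)
print(round(level[-1], 2))                      # filtered trend estimate at the last step
```

    Adding regression terms for explanatory variables such as SOI or VDI amounts to augmenting the state vector with their (possibly time-varying) coefficients.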

  6. Hybrid chemical and nondestructive-analysis technique

    International Nuclear Information System (INIS)

    Hsue, S.T.; Marsh, S.F.; Marks, T.

    1982-01-01

    A hybrid chemical/NDA technique has been applied at the Los Alamos National Laboratory to the assay of plutonium in ion-exchange effluents. Typical effluent solutions contain low concentrations of plutonium and high concentrations of americium. A simple trioctylphosphine oxide (TOPO) separation can remove 99.9% of the americium. The organic phase that contains the separated plutonium can be accurately assayed by monitoring the uranium L x-ray intensities

  7. Validation of non-stationary precipitation series for site-specific impact assessment: comparison of two statistical downscaling techniques

    Science.gov (United States)

    Mullan, Donal; Chen, Jie; Zhang, Xunchang John

    2016-02-01

    Statistical downscaling (SD) methods have become a popular, low-cost and accessible means of bridging the gap between the coarse spatial resolution at which climate models output climate scenarios and the finer spatial scale at which impact modellers require these scenarios, with various different SD techniques used for a wide range of applications across the world. This paper compares the Generator for Point Climate Change (GPCC) model and the Statistical DownScaling Model (SDSM)—two contrasting SD methods—in terms of their ability to generate precipitation series under non-stationary conditions across ten contrasting global climates. The mean, maximum and a selection of distribution statistics as well as the cumulative frequencies of dry and wet spells for four different temporal resolutions were compared between the models and the observed series for a validation period. Results indicate that both methods can generate daily precipitation series that generally closely mirror observed series for a wide range of non-stationary climates. However, GPCC tends to overestimate higher precipitation amounts, whilst SDSM tends to underestimate these. This implies that GPCC is more likely to overestimate the effects of precipitation on a given impact sector, whilst SDSM is likely to underestimate the effects. GPCC performs better than SDSM in reproducing wet and dry day frequency, which is a key advantage for many impact sectors. Overall, the mixed performance of the two methods illustrates the importance of users performing a thorough validation in order to determine the influence of simulated precipitation on their chosen impact sector.
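
    The wet-day-frequency validation mentioned above can be sketched as follows; both the "observed" and "downscaled" series are synthetic occurrence-times-intensity stand-ins, not GPCC or SDSM output.

```python
import numpy as np

def wet_day_frequency(precip, threshold=0.1):
    """Fraction of days with precipitation above a wet-day threshold (mm)."""
    return float(np.mean(np.asarray(precip) > threshold))

rng = np.random.default_rng(8)
# synthetic daily precipitation (mm): Bernoulli occurrence times gamma intensity
obs = (rng.random(3650) < 0.35) * rng.gamma(0.7, 8.0, 3650)   # stand-in observed series
sim = (rng.random(3650) < 0.30) * rng.gamma(0.7, 9.0, 3650)   # stand-in downscaled series

bias = wet_day_frequency(sim) - wet_day_frequency(obs)
print(round(bias, 3))   # negative here: the simulator produces too few wet days
```

    The same comparison repeated for spell lengths and distribution quantiles is the kind of validation the paper recommends before using downscaled series in an impact model.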

  8. Analysis of series resistance effects on forward I - V and C - V characteristics of mis type diodes

    International Nuclear Information System (INIS)

    Altindal, S.; Tekeli, Z.; Karadeniz, S.; Tugluoglu, N.; Ercan, I.

    2002-01-01

    In order to determine the series resistance Rs, we followed Lie et al., Cheung et al. and Kang et al., using the plot of I vs. dV/dLn(I), which is linear over a wide range of current values at each temperature. The values of Rs were obtained from the slopes of the linear parts of the curves, and the series resistance at each temperature was then evaluated from the Ln(I) vs. (V-IRs) curves, which are linear over a wide range of voltage. The most reliable values of the ideality factor n and the reverse saturation current Is were then determined. In addition, the role of the series resistance in the C-V and G-V characteristics of the diode has been investigated. Both C-V and G-V measurements show that the measured capacitance and conductance vary significantly with applied bias and frequency due to the presence of Rs. The density of interface states, barrier height and series resistance obtained from the forward-bias I-V characteristics using this method agree very well with those obtained from the capacitance technique. It is clear that ignoring the series resistance (in a device with high series resistance) can lead to significant errors in the analysis of the I-V-T, C-V-f and G-V-f characteristics
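
    The dV/dLn(I)-vs-I construction used above (after Cheung et al.) can be sketched on a synthetic diode with assumed parameters: the slope of the linear fit recovers Rs and the intercept gives n·kT/q. The parameter values below are invented for illustration.

```python
import numpy as np

kT_q = 0.0259                                   # thermal voltage at ~300 K (V)
n_true, Rs_true, Is = 1.8, 50.0, 1e-9           # assumed diode parameters (illustrative)

I = np.logspace(-6, -3, 60)                     # forward current sweep (A)
V = n_true * kT_q * np.log(I / Is) + I * Rs_true   # thermionic-emission I-V with series R

# Cheung construction: dV/d(ln I) = n*kT/q + I*Rs, i.e. linear in I
dV_dlnI = np.gradient(V, np.log(I))
Rs, intercept = np.polyfit(I, dV_dlnI, 1)
n = intercept / kT_q
print(round(Rs, 1), round(n, 2))                # recovers the assumed Rs and n
```

    With measured data, the same linear fit restricted to the downward-curving high-current region yields Rs and n at each temperature.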

  9. Data analysis techniques for gravitational wave observations

    Indian Academy of Sciences (India)

    Astrophysical sources of gravitational waves fall broadly into three categories: (i) transient and bursts, (ii) periodic or continuous wave and (iii) stochastic. Each type of source requires a different type of data analysis strategy. In this talk various data analysis strategies will be reviewed. Optimal filtering is used for extracting ...

  10. The Analysis Of Personality Disorder On Two Characters In The Animation Series Black Rock Shooter

    OpenAIRE

    Ramadhana, Rizki Andrian

    2015-01-01

    The title of this thesis is The Analysis of Personality Disorder on Two Characters in the Animation Series “Black Rock Shooter”, which discusses the personality disorder of two characters from this series: Kagari Izuriha and Yomi Takanashi. The animation series Black Rock Shooter was chosen as the source of data because it has a psychological genre and represents the complexity of human relationships, especially when building up a friendship. It is because human is a social...

  11. Ridge Preservation with Modified “Socket-Shield” Technique: A Methodological Case Series

    Directory of Open Access Journals (Sweden)

    Markus Glocker

    2014-01-01

    Full Text Available After tooth extraction, the alveolar bone undergoes a remodeling process, which leads to horizontal and vertical bone loss. These resorption processes complicate dental rehabilitation, particularly in connection with implants. Various methods of guided bone regeneration (GBR have been described to retain the original dimension of the bone after extraction. Most procedures use filler materials and membranes to support the buccal plate and soft tissue, to stabilize the coagulum and to prevent epithelial ingrowth. It has also been suggested that resorption of the buccal bundle bone can be avoided by leaving a buccal root segment (socket shield technique in place, because the biological integrity of the buccal periodontium (bundle bone remains untouched. This method has also been described in connection with immediate implant placement. The present case report describes three consecutive cases in which a modified method was applied as part of a delayed implantation. The latter was carried out after six months, and during re-entry the new bone formation in the alveolar bone and the residual ridge was clinically evaluated as proof of principle. It was demonstrated that the bone was clinically preserved with this method. Possibilities and limitations are discussed and directions for future research are disclosed.

  12. Multi-Scale Entropy Analysis as a Method for Time-Series Analysis of Climate Data

    Directory of Open Access Journals (Sweden)

    Heiko Balzter

    2015-03-01

    Full Text Available Evidence is mounting that the temporal dynamics of the climate system are changing at the same time as the average global temperature is increasing due to multiple climate forcings. A large number of extreme weather events such as prolonged cold spells, heatwaves, droughts and floods have been recorded around the world in the past 10 years. Such changes in the temporal scaling behaviour of climate time-series data can be difficult to detect. While there are easy and direct ways of analysing climate data by calculating the means and variances for different levels of temporal aggregation, these methods can miss more subtle changes in their dynamics. This paper describes multi-scale entropy (MSE analysis as a tool to study climate time-series data and to identify temporal scales of variability and their change over time in climate time-series. MSE estimates the sample entropy of the time-series after coarse-graining at different temporal scales. An application of MSE to Central European, variance-adjusted, mean monthly air temperature anomalies (CRUTEM4v is provided. The results show that the temporal scales of the current climate (1960–2014 are different from the long-term average (1850–1960. For temporal scale factors longer than 12 months, the sample entropy increased markedly compared to the long-term record. Such an increase can be explained by systems theory with greater complexity in the regional temperature data. From 1961 the patterns of monthly air temperatures are less regular at time-scales greater than 12 months than in the earlier time period. This finding suggests that, at these inter-annual time scales, the temperature variability has become less predictable than in the past. It is possible that climate system feedbacks are expressed in altered temporal scales of the European temperature time-series data. A comparison with the variance and Shannon entropy shows that MSE analysis can provide additional information on the
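
    The coarse-graining and sample-entropy steps described above can be sketched in a few lines of Python. This is a minimal illustration, not the CRUTEM4v analysis itself; the parameter choices m = 2 and r = 0.2 times the series standard deviation are common defaults, assumed here for the example:

```python
import math

def coarse_grain(x, tau):
    # Average non-overlapping windows of length tau (the scale factor)
    n = len(x) // tau
    return [sum(x[i * tau:(i + 1) * tau]) / tau for i in range(n)]

def sample_entropy(x, m=2, r=0.2):
    # r is expressed as a fraction of the series standard deviation
    n = len(x)
    mean = sum(x) / n
    sd = (sum((v - mean) ** 2 for v in x) / n) ** 0.5
    tol = r * sd

    def count_matches(length):
        # Count template pairs within tolerance (Chebyshev distance)
        templates = [x[i:i + length] for i in range(n - length)]
        c = 0
        for i in range(len(templates)):
            for j in range(i + 1, len(templates)):
                if max(abs(a - b) for a, b in zip(templates[i], templates[j])) <= tol:
                    c += 1
        return c

    b = count_matches(m)
    a = count_matches(m + 1)
    return -math.log(a / b) if a > 0 and b > 0 else float('inf')

def multiscale_entropy(x, scales=(1, 2, 4)):
    # MSE: sample entropy of the series after coarse-graining at each scale
    return {tau: sample_entropy(coarse_grain(x, tau)) for tau in scales}
```

    Plotting the resulting entropy values against the scale factor gives the MSE curve whose change between the 1850–1960 and 1960–2014 windows the paper describes.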

  13. A Multivariate Time Series Method for Monte Carlo Reactor Analysis

    International Nuclear Information System (INIS)

    Taro Ueki

    2008-01-01

    A robust multivariate time series method has been established for the Monte Carlo calculation of neutron multiplication problems. The method is termed Coarse Mesh Projection Method (CMPM) and can be implemented using the coarse statistical bins for acquisition of nuclear fission source data. A novel aspect of CMPM is the combination of the general technical principle of projection pursuit in the signal processing discipline and the neutron multiplication eigenvalue problem in the nuclear engineering discipline. CMPM enables reactor physicists to accurately evaluate major eigenvalue separations of nuclear reactors with continuous energy Monte Carlo calculation. CMPM was incorporated in the MCNP Monte Carlo particle transport code of Los Alamos National Laboratory. The great advantage of CMPM over the traditional Fission Matrix method is demonstrated for the three space-dimensional modeling of the initial core of a pressurized water reactor.

  14. Visualization techniques for malware behavior analysis

    Science.gov (United States)

    Grégio, André R. A.; Santos, Rafael D. C.

    2011-06-01

    Malware spread via Internet is a great security threat, so studying their behavior is important to identify and classify them. Using SSDT hooking we can obtain malware behavior by running it in a controlled environment and capturing interactions with the target operating system regarding file, process, registry, network and mutex activities. This generates a chain of events that can be used to compare them with other known malware. In this paper we present a simple approach to convert malware behavior into activity graphs and show some visualization techniques that can be used to analyze malware behavior, individually or grouped.
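
    As a toy illustration of the conversion step, an ordered chain of captured behaviour events can be turned into a directed activity graph by linking each event to the one that follows it. The event names below are invented for the example; they are not from the paper's dataset:

```python
def activity_graph(events):
    # Convert an ordered chain of behaviour events into a directed graph:
    # each event gets an edge to the event that immediately follows it.
    graph = {}
    for src, dst in zip(events, events[1:]):
        graph.setdefault(src, set()).add(dst)
    return graph

# Hypothetical event chain captured from a sandboxed sample
chain = [
    ("process", "create:a.exe"),
    ("file", "write:C:\\tmp\\x.dll"),
    ("registry", "set:Run"),
    ("network", "connect:10.0.0.5"),
]
```

    Graphs built this way from different samples can then be compared or visualized to group samples with similar behaviour.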

  15. INVERSE FILTERING TECHNIQUES IN SPEECH ANALYSIS

    African Journals Online (AJOL)

    Dr Obe

    domain or in the frequency domain. However their .... computer to speech analysis led to important elaborations ... tool for the estimation of formant trajectory (10), ... prediction Linear prediction In effect determines the filter .... Radio Res. Lab.

  16. Techniques for Intelligence Analysis of Networks

    National Research Council Canada - National Science Library

    Cares, Jeffrey R

    2005-01-01

    ...) there are significant intelligence analysis manifestations of these properties; and (4) a more satisfying theory of Networked Competition than currently exists for NCW/NCO is emerging from this research...

  17. The Network Protocol Analysis Technique in Snort

    Science.gov (United States)

    Wu, Qing-Xiu

    Network protocol analysis is the technical means by which a network sniffer captures packets for further analysis and understanding. Network sniffing intercepts packets and reassembles the binary-format raw message content. In order to obtain the information they contain, the data packets must be restored according to the TCP/IP protocol stack specifications, recovering the protocol format and content at each protocol layer: the actual data transferred as well as the application tier.
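
    As a small illustration of the per-layer restoration step, the fixed 20-byte portion of an IPv4 header can be decoded from raw captured bytes with Python's struct module. This is a sketch of the general idea, not Snort's own decoder:

```python
import struct
import socket

def parse_ipv4_header(packet):
    # Decode the fixed 20-byte IPv4 header from raw captured bytes,
    # the first step in restoring per-layer protocol content.
    vihl, tos, total_len, ident, flags, ttl, proto, csum, src, dst = \
        struct.unpack('!BBHHHBBH4s4s', packet[:20])
    return {
        'version': vihl >> 4,
        'ihl': (vihl & 0x0F) * 4,   # header length in bytes
        'protocol': proto,          # 6 = TCP, 17 = UDP
        'src': socket.inet_ntoa(src),
        'dst': socket.inet_ntoa(dst),
    }
```

    A full analyzer would next hand the payload to a TCP or UDP parser according to the protocol field, and so on up the stack.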

  18. Uncertainty analysis technique for OMEGA Dante measurements

    Science.gov (United States)

    May, M. J.; Widmann, K.; Sorce, C.; Park, H.-S.; Schneider, M.

    2010-10-01

    The Dante is an 18 channel x-ray filtered diode array which records the spectrally and temporally resolved radiation flux from various targets (e.g., hohlraums, etc.) at x-ray energies between 50 eV and 10 keV. It is a main diagnostic installed on the OMEGA laser facility at the Laboratory for Laser Energetics, University of Rochester. The absolute flux is determined from the photometric calibration of the x-ray diodes, filters and mirrors, and an unfold algorithm. Understanding the errors on this absolute measurement is critical for understanding hohlraum energetic physics. We present a new method for quantifying the uncertainties on the determined flux using a Monte Carlo parameter variation technique. This technique combines the uncertainties in both the unfold algorithm and the error from the absolute calibration of each channel into a one sigma Gaussian error function. One thousand test voltage sets are created using these error functions and processed by the unfold algorithm to produce individual spectra and fluxes. Statistical methods are applied to the resultant set of fluxes to estimate error bars on the measurements.
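
    The parameter-variation idea can be sketched as follows. A toy linear "unfold" stands in for the real Dante algorithm, and the response values and 5% per-channel error are invented for illustration:

```python
import random
import statistics

def toy_unfold(voltages, responses):
    # Hypothetical stand-in for the Dante unfold: flux ~ sum(V_i / R_i)
    return sum(v / r for v, r in zip(voltages, responses))

def flux_uncertainty(voltages, responses, rel_err=0.05, n_trials=1000, seed=1):
    # Monte Carlo parameter variation: draw many test voltage sets from
    # one-sigma Gaussian error functions and unfold each one.
    rng = random.Random(seed)
    fluxes = []
    for _ in range(n_trials):
        perturbed = [v * (1 + rng.gauss(0, rel_err)) for v in voltages]
        fluxes.append(toy_unfold(perturbed, responses))
    return statistics.mean(fluxes), statistics.stdev(fluxes)
```

    The standard deviation of the resulting flux distribution is then quoted as the error bar on the measurement, exactly as the abstract describes for the thousand test voltage sets.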

  19. Uncertainty analysis technique for OMEGA Dante measurements

    International Nuclear Information System (INIS)

    May, M. J.; Widmann, K.; Sorce, C.; Park, H.-S.; Schneider, M.

    2010-01-01

    The Dante is an 18 channel x-ray filtered diode array which records the spectrally and temporally resolved radiation flux from various targets (e.g., hohlraums, etc.) at x-ray energies between 50 eV and 10 keV. It is a main diagnostic installed on the OMEGA laser facility at the Laboratory for Laser Energetics, University of Rochester. The absolute flux is determined from the photometric calibration of the x-ray diodes, filters and mirrors, and an unfold algorithm. Understanding the errors on this absolute measurement is critical for understanding hohlraum energetic physics. We present a new method for quantifying the uncertainties on the determined flux using a Monte Carlo parameter variation technique. This technique combines the uncertainties in both the unfold algorithm and the error from the absolute calibration of each channel into a one sigma Gaussian error function. One thousand test voltage sets are created using these error functions and processed by the unfold algorithm to produce individual spectra and fluxes. Statistical methods are applied to the resultant set of fluxes to estimate error bars on the measurements.

  20. Uncertainty Analysis Technique for OMEGA Dante Measurements

    International Nuclear Information System (INIS)

    May, M.J.; Widmann, K.; Sorce, C.; Park, H.; Schneider, M.

    2010-01-01

    The Dante is an 18 channel X-ray filtered diode array which records the spectrally and temporally resolved radiation flux from various targets (e.g. hohlraums, etc.) at X-ray energies between 50 eV and 10 keV. It is a main diagnostic installed on the OMEGA laser facility at the Laboratory for Laser Energetics, University of Rochester. The absolute flux is determined from the photometric calibration of the X-ray diodes, filters and mirrors, and an unfold algorithm. Understanding the errors on this absolute measurement is critical for understanding hohlraum energetic physics. We present a new method for quantifying the uncertainties on the determined flux using a Monte-Carlo parameter variation technique. This technique combines the uncertainties in both the unfold algorithm and the error from the absolute calibration of each channel into a one sigma Gaussian error function. One thousand test voltage sets are created using these error functions and processed by the unfold algorithm to produce individual spectra and fluxes. Statistical methods are applied to the resultant set of fluxes to estimate error bars on the measurements.

  1. Time Series in Education: The Analysis of Daily Attendance in Two High Schools

    Science.gov (United States)

    Koopmans, Matthijs

    2011-01-01

    This presentation discusses the use of a time series approach to the analysis of daily attendance in two urban high schools over the course of one school year (2009-10). After establishing that the series for both schools were stationary, they were examined for moving average processes, autoregression, seasonal dependencies (weekly cycles),…

  2. Mapping air temperature using time series analysis of LST : The SINTESI approach

    NARCIS (Netherlands)

    Alfieri, S.M.; De Lorenzi, F.; Menenti, M.

    2013-01-01

    This paper presents a new procedure to map time series of air temperature (Ta) at fine spatial resolution using time series analysis of satellite-derived land surface temperature (LST) observations. The method assumes that air temperature is known at a single (reference) location such as in gridded

  3. Time-series analysis of Nigeria rice supply and demand: Error ...

    African Journals Online (AJOL)

    The study examined a time-series analysis of Nigeria rice supply and demand with a view to determining any long-run equilibrium between them using the Error Correction Model approach (ECM). The data used for the study represents the annual series of 1960-2007 (47 years) for rice supply and demand in Nigeria, ...

  4. Taxation in Public Education. Analysis and Bibliography Series, No. 12.

    Science.gov (United States)

    Ross, Larry L.

    Intended for both researchers and practitioners, this analysis and bibliography cites approximately 100 publications on educational taxation, including general texts and reports, statistical reports, taxation guidelines, and alternative proposals for taxation. Topics covered in the analysis section include State and Federal aid, urban and suburban…

  5. Hybrid analysis for indicating patients with breast cancer using temperature time series.

    Science.gov (United States)

    Silva, Lincoln F; Santos, Alair Augusto S M D; Bravo, Renato S; Silva, Aristófanes C; Muchaluat-Saade, Débora C; Conci, Aura

    2016-07-01

    Breast cancer is the most common cancer among women worldwide. Diagnosis and treatment in early stages increase cure chances. The temperature of cancerous tissue is generally higher than that of healthy surrounding tissues, making thermography an option to be considered in screening strategies of this cancer type. This paper proposes a hybrid methodology for analyzing dynamic infrared thermography in order to indicate patients with risk of breast cancer, using unsupervised and supervised machine learning techniques, which characterizes the methodology as hybrid. The dynamic infrared thermography monitors or quantitatively measures temperature changes on the examined surface, after a thermal stress. In the dynamic infrared thermography execution, a sequence of breast thermograms is generated. In the proposed methodology, this sequence is processed and analyzed by several techniques. First, the region of the breasts is segmented and the thermograms of the sequence are registered. Then, temperature time series are built and the k-means algorithm is applied on these series using various values of k. Clustering formed by k-means algorithm, for each k value, is evaluated using clustering validation indices, generating values treated as features in the classification model construction step. A data mining tool was used to solve the combined algorithm selection and hyperparameter optimization (CASH) problem in classification tasks. Besides the classification algorithm recommended by the data mining tool, classifiers based on Bayesian networks, neural networks, decision rules and decision tree were executed on the data set used for evaluation. Test results support that the proposed analysis methodology is able to indicate patients with breast cancer. Among 39 tested classification algorithms, K-Star and Bayes Net presented 100% classification accuracy. Furthermore, among the Bayes Net, multi-layer perceptron, decision table and random forest classification algorithms, an
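
    The clustering step on temperature time series can be sketched with a plain Lloyd's-algorithm k-means plus one simple internal validation index. This is a minimal stand-in for the paper's pipeline; the naive first-k initialisation below replaces a proper seeding scheme such as k-means++:

```python
def kmeans(series, k, iters=50):
    # Lloyd's algorithm; each temperature time series is a point in R^T.
    # Naive initialisation from the first k series (k-means++ would be better).
    centers = [list(s) for s in series[:k]]
    assign = [0] * len(series)
    for _ in range(iters):
        # Assignment step: nearest centre by squared Euclidean distance
        for i, s in enumerate(series):
            assign[i] = min(range(k),
                            key=lambda c: sum((a - b) ** 2
                                              for a, b in zip(s, centers[c])))
        # Update step: each centre becomes the mean of its members
        for c in range(k):
            members = [s for i, s in enumerate(series) if assign[i] == c]
            if members:
                centers[c] = [sum(col) / len(members) for col in zip(*members)]
    return assign, centers

def within_ss(series, assign, centers):
    # Total within-cluster sum of squares, a basic validation index:
    # lower values indicate tighter clusters for a given k.
    return sum(sum((a - b) ** 2 for a, b in zip(s, centers[c]))
               for s, c in zip(series, assign))
```

    In the paper's methodology, index values computed for each k become the features fed to the supervised classifiers.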

  6. Reliability analysis techniques for the design engineer

    International Nuclear Information System (INIS)

    Corran, E.R.; Witt, H.H.

    1980-01-01

    A fault tree analysis package is described that eliminates most of the housekeeping tasks involved in proceeding from the initial construction of a fault tree to the final stage of presenting a reliability analysis in a safety report. It is suitable for designers with relatively little training in reliability analysis and computer operation. Users can rapidly investigate the reliability implications of various options at the design stage, and evolve a system which meets specified reliability objectives. Later independent review is thus unlikely to reveal major shortcomings necessitating modification and project delays. The package operates interactively, allowing the user to concentrate on the creative task of developing the system fault tree, which may be modified and displayed graphically. For preliminary analysis, system data can be derived automatically from a generic data bank. As the analysis proceeds, improved estimates of critical failure rates and test and maintenance schedules can be inserted. The computations are standard: identification of minimal cut-sets, estimation of reliability parameters, and ranking of the effect of the individual component failure modes and system failure modes on these parameters. The user can vary the fault trees and data on-line, and print selected data for preferred systems in a form suitable for inclusion in safety reports. A case history is given - that of the HIFAR containment isolation system. (author)
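
    The core computation such a package automates can be illustrated with a tiny recursive gate evaluator. The probabilities below are made up purely for illustration, and basic events are assumed independent; real packages additionally extract minimal cut-sets and importance rankings:

```python
def gate_prob(node):
    # Evaluate the top-event probability of a small fault tree.
    # A node is either a float (a basic-event failure probability) or a
    # tuple ('AND' | 'OR', [children]); basic events assumed independent.
    if isinstance(node, float):
        return node
    op, children = node
    probs = [gate_prob(c) for c in children]
    result = 1.0
    if op == 'AND':
        for p in probs:
            result *= p
        return result
    # 'OR' of independent events: 1 - product of (1 - p)
    for p in probs:
        result *= (1.0 - p)
    return 1.0 - result
```

    For example, a top event requiring both of two redundant trains to fail is an AND gate over two OR gates of the trains' basic events.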

  7. Nuclear reactor seismic safety analysis techniques

    International Nuclear Information System (INIS)

    Cummings, G.E.; Wells, J.E.; Lewis, L.C.

    1979-04-01

    In order to provide insights into the seismic safety requirements for nuclear power plants, a probabilistic based systems model and computational procedure have been developed. This model and computational procedure will be used to identify where data and modeling uncertainties need to be decreased by studying the effect of these uncertainties on the probability of radioactive release and the probability of failure of various structures, systems, and components. From the estimates of failure and release probabilities and their uncertainties the most sensitive steps in the seismic methodologies can be identified. In addition, the procedure will measure the uncertainty due to random occurrences, e.g. seismic event probabilities, material property variability, etc. The paper discusses the elements of this systems model and computational procedure, the event-tree/fault-tree development, and the statistical techniques to be employed

  8. Analysis of Jordanian Cigarettes Using XRF Techniques

    International Nuclear Information System (INIS)

    Kullab, M.; Ismail, A.; AL-kofahi, M.

    2002-01-01

    Sixteen brands of Jordanian cigarettes were analyzed using X-ray Fluorescence (XRF) techniques. These cigarettes were found to contain the elements: Si, S, Cl, K, Ca, P, Ti, Mn, Fe, Cu, Zn, Br, Rb and Sr. The major elements, with concentrations of more than 1% by weight, were Cl, K and Ca. The elements with minor concentrations, between 0.1 and 1% by weight, were Si, S and P. The trace elements with concentrations below 0.1% by weight were Ti, Mn, Fe, Cu, Zn, Br, Rb and Sr. The toxicity of some trace elements, like Br, Rb and Sr, which are present in some brands of Jordanian cigarettes, is discussed. (Author's) 24 refs., 1 tab., 1 fig

  9. Plate Fixation With Autogenous Calcaneal Dowel Grafting Proximal Fourth and Fifth Metatarsal Fractures: Technique and Case Series.

    Science.gov (United States)

    Seidenstricker, Chad L; Blahous, Edward G; Bouché, Richard T; Saxena, Amol

    Metaphyseal and proximal diaphyseal fractures of the lateral column metatarsals can have problems with healing. In particular, those involving the fifth metatarsal have been associated with a high nonunion rate with nonoperative treatment. Although intramedullary screw fixation results in a high union rate, delayed healing and complications can occur. We describe an innovative technique to treat both acute and chronic injuries involving the metatarsal base from the metaphysis to the proximal diaphyseal bone of the fourth and fifth metatarsals. The surgical technique involves evacuation of sclerotic bone at the fracture site, packing the fracture site with compact cancellous bone, and plate fixation. In our preliminary results, 4 patients displayed 100% radiographic union at a mean of 4.75 (range 4 to 6) weeks with no incidence of refracture, at a mean follow-up point of 3.5 (range 1 to 5) years. The early results with our small series suggest that this technique is a useful treatment choice for metaphyseal and proximal diaphyseal fractures of the fourth and fifth metatarsals. Copyright © 2017 American College of Foot and Ankle Surgeons. Published by Elsevier Inc. All rights reserved.

  10. Decentralized control using compositional analysis techniques

    NARCIS (Netherlands)

    Kerber, F.; van der Schaft, A. J.

    2011-01-01

    Decentralized control strategies aim at achieving a global control target by means of distributed local controllers acting on individual subsystems of the overall plant. In this sense, decentralized control is a dual problem to compositional analysis where a global verification task is decomposed

  11. Techniques and Applications of Urban Data Analysis

    KAUST Repository

    AlHalawani, Sawsan

    2016-01-01

    Digitization and characterization of urban spaces are essential components as we move to an ever-growing ’always connected’ world. Accurate analysis of such digital urban spaces has become more important as we continue to get spatial and social

  12. Evaluating Dynamic Analysis Techniques for Program Comprehension

    NARCIS (Netherlands)

    Cornelissen, S.G.M.

    2009-01-01

    Program comprehension is an essential part of software development and software maintenance, as software must be sufficiently understood before it can be properly modified. One of the common approaches in getting to understand a program is the study of its execution, also known as dynamic analysis.

  13. RECONSTRUCTION OF PRECIPITATION SERIES AND ANALYSIS OF CLIMATE CHANGE OVER PAST 500 YEARS IN NORTHERN CHINA

    Institute of Scientific and Technical Information of China (English)

    RONG Yan-shu; TU Qi-pu

    2005-01-01

    It is important and necessary to obtain a much longer precipitation series in order to research features of drought/flood and climate change. Based on dryness and wetness grade series of 18 stations in Northern China covering 533 years, from 1470 to 2002, the Moving Cumulative Frequency Method (MCFM) was developed, moving average precipitation series from 1499 to 2002 were reconstructed by testing three kinds of average precipitation, and the features of climate change and of dry and wet periods were researched using the reconstructed precipitation series. The results showed a good relationship between the reconstructed precipitation series and the observed precipitation series since 1954, with relative root-mean-square errors below 1.89%. The relation between the reconstructed series and the dryness and wetness grade series was nonlinear, and this nonlinear relation implies that the reconstructed series are reliable and can become foundation data for researching the evolution of drought and flood. Analysis of climate change based on the reconstructed precipitation series revealed that, although the drought intensity of the recent dry period, from the mid-1970s until the early 21st century, was not the strongest in the historical climate of Northern China, the intensity and duration of wet periods have decreased and shortened considerably, and the climate of Northern China is evolving toward aridification.

  14. Stock price forecasting based on time series analysis

    Science.gov (United States)

    Chi, Wan Le

    2018-05-01

    Using historical stock price data to set up a sequence model that explains the intrinsic relationships in the data, future stock prices can be forecast. The models used are the autoregressive (AR) model, the moving-average (MA) model and the autoregressive moving-average (ARMA) model. A unit root test was applied to the original data sequence to judge whether it was stationary. A non-stationary original sequence required further processing by first-order differencing, after which the stationarity of the differenced sequence was re-inspected; if still non-stationary, second-order differencing was carried out. Autocorrelation and partial autocorrelation diagrams were used to estimate the parameters of the identified ARMA model, including the model coefficients and the model order. Finally, the model was used to forecast the Shanghai Composite Index daily closing price. Results showed that the non-stationary original data series became stationary after second-order differencing, and the forecast values of the Shanghai Composite Index daily closing price were close to the actual values, indicating that the ARMA model in the paper achieved a certain accuracy.
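
    The differencing-and-identification workflow can be sketched without any statistics library. A toy AR(1) least-squares fit stands in here for full ARMA estimation with order selection:

```python
def difference(x, order=1):
    # Apply first-order differencing `order` times to remove trend
    for _ in range(order):
        x = [b - a for a, b in zip(x, x[1:])]
    return x

def fit_ar1(x):
    # Least-squares estimate of phi in x_t = phi * x_{t-1} + e_t
    num = sum(a * b for a, b in zip(x, x[1:]))
    den = sum(a * a for a in x[:-1])
    return num / den

def forecast_next(x, phi):
    # One-step-ahead forecast from the fitted AR(1) model
    return phi * x[-1]
```

    In practice one would difference until a unit root test (such as ADF) accepts stationarity, then pick the AR and MA orders from the autocorrelation and partial autocorrelation diagrams, as the abstract describes.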

  15. A unified nonlinear stochastic time series analysis for climate science.

    Science.gov (United States)

    Moon, Woosok; Wettlaufer, John S

    2017-03-13

    Earth's orbit and axial tilt imprint a strong seasonal cycle on climatological data. Climate variability is typically viewed in terms of fluctuations in the seasonal cycle induced by higher frequency processes. We can interpret this as a competition between the orbitally enforced monthly stability and the fluctuations/noise induced by weather. Here we introduce a new time-series method that determines these contributions from monthly-averaged data. We find that the spatio-temporal distribution of the monthly stability and the magnitude of the noise reveal key fingerprints of several important climate phenomena, including the evolution of the Arctic sea ice cover, the El Niño Southern Oscillation (ENSO), the Atlantic Niño and the Indian Dipole Mode. In analogy with the classical destabilising influence of the ice-albedo feedback on summertime sea ice, we find that during some time interval of the season a destabilising process operates in all of these climate phenomena. The interaction between the destabilisation and the accumulation of noise, which we term the memory effect, underlies phase locking to the seasonal cycle and the statistical nature of seasonal predictability.

  16. Methodology Series Module 6: Systematic Reviews and Meta-analysis.

    Science.gov (United States)

    Setia, Maninder Singh

    2016-01-01

    Systematic reviews and meta-analyses have become an important part of the biomedical literature, and they provide the "highest level of evidence" for various clinical questions. There are a lot of studies - sometimes with contradictory conclusions - on a particular topic in the literature. Hence, as a clinician, which results will you believe? What will you tell your patient? Which drug is better? A systematic review or a meta-analysis may help us answer these questions. In addition, it may also help us understand the quality of the articles in the literature or the type of studies that have been conducted and published (for example, randomized trials or observational studies). The first step is to identify a research question for the systematic review or meta-analysis. The next step is to identify the articles that will be included in the study. This will be done by searching various databases; it is important that the researcher search for articles in more than one database. It will also be useful to form a group of researchers and statisticians with expertise in conducting systematic reviews and meta-analyses before initiating the study. We strongly encourage readers to register their proposed review/meta-analysis with PROSPERO. Finally, these studies should be reported according to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses checklist.

  17. 10th Australian conference on nuclear techniques of analysis. Proceedings

    International Nuclear Information System (INIS)

    1998-01-01

    These proceedings contain abstracts and extended abstracts of 80 lectures and posters presented at the 10th Australian conference on nuclear techniques of analysis, hosted by the Australian National University in Canberra, Australia, from 24-26 November 1997. The conference was divided into sessions on the following topics: ion beam analysis and its applications; surface science; novel nuclear techniques of analysis; characterization of thin films; electronic and optoelectronic materials formed by ion implantation; nanometre science and technology; plasma science and technology. A special session was dedicated to new nuclear techniques of analysis, future trends and developments. Separate abstracts were prepared for the individual presentations included in this volume.

  18. 10th Australian conference on nuclear techniques of analysis. Proceedings

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-06-01

    These proceedings contain abstracts and extended abstracts of 80 lectures and posters presented at the 10th Australian conference on nuclear techniques of analysis, hosted by the Australian National University in Canberra, Australia, from 24-26 November 1997. The conference was divided into sessions on the following topics: ion beam analysis and its applications; surface science; novel nuclear techniques of analysis; characterization of thin films; electronic and optoelectronic materials formed by ion implantation; nanometre science and technology; plasma science and technology. A special session was dedicated to new nuclear techniques of analysis, future trends and developments. Separate abstracts were prepared for the individual presentations included in this volume.

  19. A methodological comparison of customer service analysis techniques

    Science.gov (United States)

    James Absher; Alan Graefe; Robert Burns

    2003-01-01

    Techniques used to analyze customer service data need to be studied. Two primary analysis protocols, importance-performance analysis (IP) and gap score analysis (GA), are compared in a side-by-side comparison using data from two major customer service research projects. A central concern is what, if any, conclusion might be different due solely to the analysis...

  20. Evaluation of data reduction methods for dynamic PET series based on Monte Carlo techniques and the NCAT phantom

    International Nuclear Information System (INIS)

    Thireou, Trias; Rubio Guivernau, Jose Luis; Atlamazoglou, Vassilis; Ledesma, Maria Jesus; Pavlopoulos, Sotiris; Santos, Andres; Kontaxakis, George

    2006-01-01

    A realistic dynamic positron-emission tomography (PET) thoracic study was generated, using the 4D NURBS-based (non-uniform rational B-splines) cardiac-torso (NCAT) phantom and a sophisticated model of the PET imaging process, simulating two solitary pulmonary nodules. Three data reduction and blind source separation methods were applied to the simulated data: principal component analysis, independent component analysis and similarity mapping. All methods reduced the initial amount of image data to a smaller, comprehensive and easily managed set of parametric images, where structures were separated based on their different kinetic characteristics and the lesions were readily identified. The results indicate that the above-mentioned methods can provide an accurate tool for the support of both visual inspection and subsequent detailed kinetic analysis of the dynamic series via compartmental or non-compartmental models
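
    Of the three methods, principal component analysis is the easiest to sketch: the leading component of a small data matrix can be found by power iteration on the sample covariance matrix. This is a minimal pure-Python stand-in for the full parametric-image computation, not the authors' implementation:

```python
def leading_pc(data, iters=200):
    # Leading principal component of row-wise observations via power
    # iteration on the sample covariance matrix (pure-Python sketch).
    n, d = len(data), len(data[0])
    means = [sum(row[j] for row in data) / n for j in range(d)]
    centered = [[row[j] - means[j] for j in range(d)] for row in data]
    cov = [[sum(r[i] * r[j] for r in centered) / (n - 1) for j in range(d)]
           for i in range(d)]
    v = [1.0] * d
    for _ in range(iters):
        w = [sum(cov[i][j] * v[j] for j in range(d)) for i in range(d)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    return v

def project(data, v):
    # Score of each observation on the component; in the PET setting each
    # voxel's time-activity curve is an observation and the scores form
    # one parametric image.
    return [sum(a * b for a, b in zip(row, v)) for row in data]
```

    Applied to a dynamic series, a handful of such components replaces the full frame sequence with a small set of parametric images.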

  1. A novel water quality data analysis framework based on time-series data mining.

    Science.gov (United States)

    Deng, Weihui; Wang, Guoyin

    2017-07-01

    The rapid development of time-series data mining provides an emerging method for water resource management research. In this paper, based on the time-series data mining methodology, we propose a novel and general analysis framework for water quality time-series data. It consists of two parts: implementation components and common tasks of time-series data mining in water quality data. In the first part, we propose to granulate the time series into several two-dimensional normal clouds and calculate the similarities at the granulated level. On the basis of the similarity matrix, the similarity search, anomaly detection, and pattern discovery tasks in the water quality time-series instance dataset can be easily implemented in the second part. We present a case study of this analysis framework on weekly Dissolved Oxygen (DO) time-series data collected from five monitoring stations on the upper reaches of the Yangtze River, China. It discovered the relationship between water quality in the mainstream and tributaries as well as the main changing patterns of DO. The experimental results show that the proposed analysis framework is a feasible and efficient method to mine the hidden and valuable knowledge from water quality historical time-series data. Copyright © 2017 Elsevier Ltd. All rights reserved.
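
    The similarity-matrix tasks in the second part can be illustrated with a plain Pearson-correlation similarity. This is a deliberate simplification of the paper's cloud-model similarity, chosen because it is a standard measure:

```python
def pearson(x, y):
    # Pearson correlation between two equal-length series
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x) ** 0.5
    vy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (vx * vy)

def similarity_matrix(series):
    # Pairwise similarity matrix over a set of time series
    n = len(series)
    return [[pearson(series[i], series[j]) for j in range(n)] for i in range(n)]

def most_anomalous(series):
    # Anomaly detection on the similarity matrix: the series least
    # similar, on average, to all the others
    sim = similarity_matrix(series)
    n = len(series)
    avg = [(sum(row) - 1.0) / (n - 1) for row in sim]  # drop self-similarity
    return min(range(n), key=lambda i: avg[i])
```

    Similarity search and pattern discovery reduce to row-wise queries and clustering on the same matrix.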

  2. Nuclear techniques for analysis of environmental samples

    International Nuclear Information System (INIS)

    1986-12-01

    The main purposes of this meeting were to establish the state-of-the-art in the field, to identify new research and development that is required to provide an adequate framework for analysis of environmental samples and to assess needs and possibilities for international cooperation in problem areas. This technical report was prepared on the subject based on the contributions made by the participants. A separate abstract was prepared for each of the 9 papers

  3. Application of activation techniques to biological analysis

    International Nuclear Information System (INIS)

    Bowen, H.J.M.

    1981-01-01

    Applications of activation analysis in the biological sciences are reviewed for the period of 1970 to 1979. The stages and characteristics of activation analysis are described, and its advantages and disadvantages enumerated. Most applications involve activation by thermal neutrons followed by either radiochemical or instrumental determination. Relatively little use has been made of activation by fast neutrons, photons, or charged particles. In vivo analyses are included, but those based on prompt gamma or x-ray emission are not. Major applications include studies of reference materials, and the elemental analysis of plants, marine biota, animal and human tissues, diets, and excreta. Relatively little use of it has been made in biochemistry, microbiology, and entomology, but it has become important in toxicology and environmental science. The elements most often determined are Ag, As, Au, Br, Ca, Cd, Cl, Co, Cr, Cs, Cu, Fe, Hg, I, K, Mn, Mo, Na, Rb, Sb, Sc, Se, and Zn, while few or no determinations of B, Be, Bi, Ga, Gd, Ge, H, In, Ir, Li, Nd, Os, Pd, Pr, Pt, Re, Rh, Ru, Te, Tl, or Y have been made in biological materials

  4. Using Computer Techniques To Predict OPEC Oil Prices For Period 2000 To 2015 By Time-Series Methods

    Directory of Open Access Journals (Sweden)

    Mohammad Esmail Ahmad

    2015-08-01

    Full Text Available The instability in world and OPEC oil prices results from many factors acting over a long period. The problem can be summarized as follows: oil exports not only constitute a large share of N.I. but also make up most of the savings of the oil states. Oil prices are determined in the market through the interaction of the supply and demand forces for oil. The research hypothesis states that movements in oil prices have caused shocks, crises, and economic problems. Because of these shocks, changes in oil prices need to be predicted within the framework of short-run economic planning, using computer techniques and time-series models, in order to avoid such shocks.
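
    The abstract does not specify which time-series models were used; as a minimal, hypothetical illustration of prediction with such models, the sketch below fits a first-order autoregressive model, x_{t+1} = a + b·x_t, by least squares and iterates it forward for multi-step forecasts.

```python
def fit_ar1(series):
    """Least-squares fit of x_{t+1} = a + b * x_t."""
    x, y = series[:-1], series[1:]
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    a = my - b * mx
    return a, b

def forecast(series, steps):
    """Iterate the fitted AR(1) map forward for multi-step prediction."""
    a, b = fit_ar1(series)
    out, last = [], series[-1]
    for _ in range(steps):
        last = a + b * last
        out.append(last)
    return out
```

    A real application to oil prices would, of course, require model selection (ARIMA orders, exogenous drivers) beyond this AR(1) sketch.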

  5. Time series analysis of diverse extreme phenomena: universal features

    Science.gov (United States)

    Eftaxias, K.; Balasis, G.

    2012-04-01

    The field of study of complex systems holds that the dynamics of complex systems are founded on universal principles that may be used to describe a great variety of scientific and technological approaches to different types of natural, artificial, and social systems. We suggest that the dynamics of earthquakes, epileptic seizures, solar flares, and magnetic storms can be analyzed within similar mathematical frameworks. A central property of the generation of these extreme events is the occurrence of coherent large-scale collective behavior with very rich structure, resulting from repeated nonlinear interactions among the corresponding constituents. Consequently, we apply Tsallis nonextensive statistical mechanics, as it proves an appropriate framework for investigating universal principles of their generation. First, we examine the data in terms of Tsallis entropy, aiming to discover common "pathological" symptoms of the transition to a significant shock. By monitoring the temporal evolution of the degree of organization in time series, we observe similar distinctive features revealing a significant reduction of complexity during their emergence. Second, a model for earthquake dynamics derived from first principles within the nonextensive Tsallis formalism has recently been introduced. This approach leads to an energy distribution function (of Gutenberg-Richter type) for the magnitude distribution of earthquakes, providing an excellent fit to seismicities generated in various large geographic areas usually identified as seismic regions. We show that this function is able to describe the energy distribution (with a similar non-extensive q-parameter) of solar flares, magnetic storms, and epileptic and earthquake shocks. The above-mentioned evidence of a universal statistical behavior suggests the possibility of a common approach for studying space weather, earthquakes, and epileptic seizures.
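
    A minimal sketch of the entropy measure mentioned above: the Tsallis entropy of a discrete probability distribution, S_q = (1 − Σ p_i^q)/(q − 1), which recovers the Shannon entropy in the limit q → 1. The binning step that would turn a raw time series into the probabilities p_i is omitted here.

```python
def tsallis_entropy(p, q):
    """Tsallis entropy S_q = (1 - sum(p_i ** q)) / (q - 1) for q != 1.
    Lower values indicate a higher degree of organization."""
    assert abs(sum(p) - 1.0) < 1e-9 and q != 1
    return (1.0 - sum(pi ** q for pi in p)) / (q - 1.0)
```

    Tracking S_q over sliding windows of a series is one way to monitor the "reduction of complexity" the abstract describes.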

  6. Real analysis series, functions of several variables, and applications

    CERN Document Server

    Laczkovich, Miklós

    2017-01-01

    This book develops the theory of multivariable analysis, building on the single variable foundations established in the companion volume, Real Analysis: Foundations and Functions of One Variable. Together, these volumes form the first English edition of the popular Hungarian original, Valós Analízis I & II, based on courses taught by the authors at Eötvös Loránd University, Hungary, for more than 30 years. Numerous exercises are included throughout, offering ample opportunities to master topics by progressing from routine to difficult problems. Hints or solutions to many of the more challenging exercises make this book ideal for independent study or further reading. Intended as a sequel to a course in single variable analysis, this book builds upon and expands these ideas into higher dimensions. The modular organization makes this text adaptable for either a semester or year-long introductory course. Topics include: differentiation and integration of functions of several variables; infinite numerica...

  7. Analysis of engineering cycles thermodynamics and fluid mechanics series

    CERN Document Server

    Haywood, R W

    1980-01-01

    Analysis of Engineering Cycles, Third Edition, deals principally with an analysis of the overall performance, under design conditions, of work-producing power plants and work-absorbing refrigerating and gas-liquefaction plants, most of which are either cyclic or closely related thereto. The book is organized into two parts, dealing first with simple power and refrigerating plants and then moving on to more complex plants. The principal modifications in this Third Edition arise from the updating and expansion of material on nuclear plants and on combined and binary plants. In view of increased

  8. New analytical techniques for cuticle chemical analysis

    International Nuclear Information System (INIS)

    Schulten, H.R.

    1994-01-01

    1) The analytical methodology of pyrolysis-gas chromatography/mass spectrometry (Py-GC/MS) and direct pyrolysis-mass spectrometry (Py-MS) using soft ionization techniques by high electric fields (FI) are briefly described. Recent advances of Py-GC/MS and Py-FIMS for the analyses of complex organic matter such as plant materials, humic substances, dissolved organic matter in water (DOM) and soil organic matter (SOM) in agricultural and forest soils are given to illustrate the potential and limitations of the applied methods. 2) Novel applications of Py-GC/MS and Py-MS in combination with conventional analytical data in an integrated, chemometric approach to investigate the dynamics of plant lipids are reported. This includes multivariate statistical investigations on maturation, senescence, humus genesis, and environmental damages in spruce ecosystems. 3) The focal point is the author's integrated investigations on emission-induced changes of selected conifer plant constituents. Pattern recognition of Py-MS data of desiccated spruce needles provides a method for distinguishing needles damaged in different ways and determining the cause. Spruce needles were collected from both controls and trees treated with sulphur dioxide (acid rain), nitrogen dioxide, and ozone under controlled conditions. Py-MS and chemometric data evaluation are employed to characterize and classify leaves and their epicuticular waxes. Preliminary mass spectrometric evaluations of isolated cuticles of different plants such as spruce, ivy, holly, and philodendron, as well as ivy cuticles treated in vivo with air pollutants such as surfactants and pesticides are given. (orig.)

  9. Probabilistic risk assessment course documentation. Volume 5. System reliability and analysis techniques Session D - quantification

    International Nuclear Information System (INIS)

    Lofgren, E.V.

    1985-08-01

    This course in System Reliability and Analysis Techniques focuses on the probabilistic quantification of accident sequences and the link between accident sequences and consequences. Other sessions in this series focus on the quantification of system reliability and the development of event trees and fault trees. This course takes the viewpoint that event tree sequences or combinations of system failures and successes are available and that Boolean equations for system fault trees have been developed and are available. 93 figs., 11 tabs.

  10. A technique for human error analysis (ATHEANA)

    Energy Technology Data Exchange (ETDEWEB)

    Cooper, S.E.; Ramey-Smith, A.M.; Wreathall, J.; Parry, G.W. [and others

    1996-05-01

    Probabilistic risk assessment (PRA) has become an important tool in the nuclear power industry, both for the Nuclear Regulatory Commission (NRC) and the operating utilities. Human reliability analysis (HRA) is a critical element of PRA; however, limitations in the analysis of human actions in PRAs have long been recognized as a constraint when using PRA. A multidisciplinary HRA framework has been developed with the objective of providing a structured approach for analyzing operating experience and understanding nuclear plant safety, human error, and the underlying factors that affect them. The concepts of the framework have matured into a rudimentary working HRA method. A trial application of the method has demonstrated that it is possible to identify potentially significant human failure events from actual operating experience which are not generally included in current PRAs, as well as to identify associated performance shaping factors and plant conditions that have an observable impact on the frequency of core damage. A general process was developed, albeit in preliminary form, that addresses the iterative steps of defining human failure events and estimating their probabilities using search schemes. Additionally, a knowledge base was developed which describes the links between performance shaping factors and resulting unsafe actions.

  11. A technique for human error analysis (ATHEANA)

    International Nuclear Information System (INIS)

    Cooper, S.E.; Ramey-Smith, A.M.; Wreathall, J.; Parry, G.W.

    1996-05-01

    Probabilistic risk assessment (PRA) has become an important tool in the nuclear power industry, both for the Nuclear Regulatory Commission (NRC) and the operating utilities. Human reliability analysis (HRA) is a critical element of PRA; however, limitations in the analysis of human actions in PRAs have long been recognized as a constraint when using PRA. A multidisciplinary HRA framework has been developed with the objective of providing a structured approach for analyzing operating experience and understanding nuclear plant safety, human error, and the underlying factors that affect them. The concepts of the framework have matured into a rudimentary working HRA method. A trial application of the method has demonstrated that it is possible to identify potentially significant human failure events from actual operating experience which are not generally included in current PRAs, as well as to identify associated performance shaping factors and plant conditions that have an observable impact on the frequency of core damage. A general process was developed, albeit in preliminary form, that addresses the iterative steps of defining human failure events and estimating their probabilities using search schemes. Additionally, a knowledge base was developed which describes the links between performance shaping factors and resulting unsafe actions.

  12. Impact of novel techniques on minimally invasive adrenal surgery: trends and outcomes from a contemporary international large series in urology.

    Science.gov (United States)

    Pavan, Nicola; Autorino, Riccardo; Lee, Hak; Porpiglia, Francesco; Sun, Yinghao; Greco, Francesco; Jeff Chueh, S; Han, Deok Hyun; Cindolo, Luca; Ferro, Matteo; Chen, Xiang; Branco, Anibal; Fornara, Paolo; Liao, Chun-Hou; Miyajima, Akira; Kyriazis, Iason; Puglisi, Marco; Fiori, Cristian; Yang, Bo; Fei, Guo; Altieri, Vincenzo; Jeong, Byong Chang; Berardinelli, Francesco; Schips, Luigi; De Cobelli, Ottavio; Chen, Zhi; Haber, Georges-Pascal; He, Yao; Oya, Mototsugu; Liatsikos, Evangelos; Brandao, Luis; Challacombe, Benjamin; Kaouk, Jihad; Darweesh, Ithaar

    2016-10-01

    To evaluate contemporary international trends in the implementation of minimally invasive adrenalectomy and to assess contemporary outcomes of different minimally invasive techniques performed at urologic centers worldwide. A retrospective multinational multicenter study of patients who underwent minimally invasive adrenalectomy from 2008 to 2013 at 14 urology institutions worldwide was included in the analysis. Cases were categorized based on the minimally invasive adrenalectomy technique: conventional laparoscopy (CL), robot-assisted laparoscopy (RAL), laparoendoscopic single-site surgery (LESS), and mini-laparoscopy (ML). The rates of the four treatment modalities were determined according to the year of surgery, and a regression analysis was performed for trends in all surgical modalities. Overall, a total of 737 adrenalectomies were performed across participating institutions and included in this analysis: 337 CL (46 % of cases), 57 ML (8 %), 263 LESS (36 %), and 80 RAL (11 %). Overall, 204 (28 %) operations were performed with a retroperitoneal approach. The overall number of adrenalectomies increased from 2008 to 2013 (p = 0.05). A transperitoneal approach was preferred in all but the ML group. Asia and South America reported the highest rates of LESS procedures, and RAL was adopted to a larger extent in the USA. LESS had the fastest increase in utilization, at 6 %/year. The rate of RAL procedures increased at a slower rate (2.2 %/year), similar to ML (1.7 %/year). Limitations of this study are the retrospective design and the lack of a cost analysis. Several minimally invasive surgical techniques for the management of adrenal masses are successfully implemented in urology institutions worldwide. CL and LESS seem to represent the most commonly adopted techniques, whereas ML and RAL are growing at a slower rate. All the MIS techniques can be safely and effectively performed for a variety of adrenal diseases.

  13. Time series analysis of aerobic bacterial flora during Miso fermentation.

    Science.gov (United States)

    Onda, T; Yanagida, F; Tsuji, M; Shinohara, T; Yokotsuka, K

    2003-01-01

    This article reports a microbiological study of the aerobic mesophilic bacteria that are present during the fermentation process of Miso. Aerobic bacteria were enumerated and isolated from Miso during fermentation and divided into nine groups using traditional phenotypic tests. The strains were identified by biochemical analysis and 16S rRNA sequence analysis. They were identified as Bacillus subtilis, B. amyloliquefaciens, Kocuria kristinae, Staphylococcus gallinarum and S. kloosii. All strains were sensitive to the bacteriocins produced by the lactic acid bacteria isolated from Miso. The dominant species among the undesirable species throughout the fermentation process were B. subtilis and B. amyloliquefaciens. It is suggested that bacteriocin-producing lactic acid bacteria are effective in preventing the growth of aerobic bacteria in Miso. This study has provided useful information for controlling bacterial flora during Miso fermentation.

  14. Reconstruction of large diaphyseal bone defect by simplified bone transport over nail technique: A 7-case series.

    Science.gov (United States)

    Ferchaud, F; Rony, L; Ducellier, F; Cronier, P; Steiger, V; Hubert, L

    2017-11-01

    Reconstruction of a large diaphyseal bone defect is complex and the complications rate is high. This study aimed to assess a simplified technique of segmental bone transport by monorail external fixator over an intramedullary nail. A prospective study included 7 patients: 2 femoral and 5 tibial defects. Mean age was 31 years (range: 16-61 years). Mean follow-up was 62 months (range: 46-84 months). Defects were post-traumatic, with a mean length of 7.2 cm (range: 4 to 9.5 cm). For 3 patients, reconstruction followed primary failure. In 4 cases, a covering flap was necessary. Transport used an external fixator guided by an intramedullary nail, at a rate of 1 mm per day. One pin was implanted on either side of the distraction zone. The external fixator was removed 1 month after bone contact at the docking site. Mean bone transport time was 11 weeks (range: 7-15 weeks). Mean external fixation time was 5.1 months (range: 3.5 to 8 months). Full weight-bearing was allowed 5.7 months (range: 3.5-13 months) after initiation of transport. In one patient, a pin had to be repositioned. In 3 patients, the transported segment re-ascended after external fixator ablation, requiring repeat external fixation and resumption of transport. There was just 1 case of superficial pin infection. Reconstruction quality was considered "excellent" on the Paley-Marr criteria in 6 cases. The present technique provided excellent reconstruction quality in 6 of the 7 cases. External fixation time was shorter and resumption of weight-bearing earlier than with other reconstruction techniques, notably including bone autograft, vascularized bone graft or the induced membrane technique. Nailing facilitated control of limb axis and length. The complications rate was 50%, comparable to other techniques. This study raises the question of systematic internal fixation of the docking site, to avoid any mobilization of the transported segment. 
The bone quality, axial control and rapidity shown by the present technique make

  15. Development of chemical analysis techniques: pt. 3

    International Nuclear Information System (INIS)

    Kim, K.J.; Chi, K.Y.; Choi, G.C.

    1981-01-01

    For the purpose of determining trace rare earths, a spectrofluorimetric method has been studied. Except for Ce and Tb, the fluorescence intensities are not high enough to allow satisfactory analysis. Complexing agents such as tungstate and hexafluoroacetylacetone should be employed to increase the fluorescence intensities. As a preliminary experiment for the separation of individual rare earth elements and uranium, the distribution coefficients (denoted % S here) were obtained on Dowex 50W as a function of HCl concentration by a batch method. These % S data are utilized to obtain elution curves. The % S data showed a minimum at around 4 M HCl. To understand this previously known phenomenon, the adsorption of Cl- on Dowex 50W was examined as a function of HCl concentration and found to be decreasing while the % S of the rare earths was increasing. It is interpreted that Cl- and rare earth ions move into the resin phase separately and that the charge and charge densities of these ions are responsible for the different % S curves. Dehydration appears to play an important role in the upturn of the % S curves at higher HCl concentrations.

  16. Cluster analysis of activity-time series in motor learning

    DEFF Research Database (Denmark)

    Balslev, Daniela; Nielsen, Finn Årup; Frutiger, Sally A.

    2002-01-01

    Neuroimaging studies of learning focus on brain areas where the activity changes as a function of time. To circumvent the difficult problem of model selection, we used a data-driven analytic tool, cluster analysis, which extracts representative temporal and spatial patterns from the voxel...... practice-related activity in a fronto-parieto-cerebellar network, in agreement with previous studies of motor learning. These voxels were separated from a group of voxels showing an unspecific time-effect and another group of voxels, whose activation was an artifact from smoothing. Hum. Brain Mapping 15...

  17. Series Resistance Analysis of Passivated Emitter Rear Contact Cells Patterned Using Inkjet Printing

    Directory of Open Access Journals (Sweden)

    Martha A. T. Lenio

    2012-01-01

    Full Text Available For higher-efficiency solar cell structures, such as the Passivated Emitter Rear Contact (PERC) cells, to be fabricated in a manufacturing environment, potentially low-cost techniques such as inkjet printing and metal plating are desirable. A common problem that is experienced when fabricating PERC cells is low fill factors due to high series resistance. This paper identifies and attempts to quantify sources of series resistance in inkjet-patterned PERC cells that employ electroless or light-induced nickel-plating techniques followed by copper light-induced plating. Photoluminescence imaging is used to determine locations of series resistance losses in these inkjet-patterned and plated PERC cells.

  18. Cointegration and Error Correction Modelling in Time-Series Analysis: A Brief Introduction

    Directory of Open Access Journals (Sweden)

    Helmut Thome

    2015-07-01

    Full Text Available Criminological research is often based on time-series data showing some type of trend movement. Trending time-series may correlate strongly even in cases where no causal relationship exists (spurious causality). To avoid this problem researchers often apply some technique of detrending their data, such as by differencing the series. This approach, however, may bring up another problem: that of spurious non-causality. Both problems can, in principle, be avoided if the series under investigation are “difference-stationary” (if the trend movements are stochastic) and “cointegrated” (if the stochastically changing trend movements in different variables correspond to each other). The article gives a brief introduction to key instruments and interpretative tools applied in cointegration modelling.
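
    A minimal sketch of the first (regression) step of cointegration modelling in the Engle-Granger spirit, assuming ordinary least squares; the unit-root test on the residuals that completes the procedure (and the error-correction model itself) is omitted here.

```python
def ols(x, y):
    """Ordinary least squares for y = alpha + beta * x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    beta = sum((a - mx) * (b - my) for a, b in zip(x, y)) / \
           sum((a - mx) ** 2 for a in x)
    alpha = my - beta * mx
    return alpha, beta

def cointegration_residuals(x, y):
    """Engle-Granger step 1: regress y on x and return the residuals.
    If x and y are cointegrated, the residuals are stationary; the
    unit-root test that verifies this (step 2) is omitted."""
    alpha, beta = ols(x, y)
    return [yi - (alpha + beta * xi) for xi, yi in zip(x, y)]
```

    If two trending series share a stochastic trend, their residuals stay bounded even though each series wanders, which is exactly what the unit-root test on the residuals then checks.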

  19. Statistical analysis of yearly series of maximum daily rainfall in Spain. Análisis estadístico de las series anuales de máximas lluvias diarias en España

    Energy Technology Data Exchange (ETDEWEB)

    Ferrer Polo, J.; Ardiles Lopez, K. L. (CEDEX, Ministerio de Obras Publicas, Transportes y Medio ambiente, Madrid (Spain))

    1994-01-01

    Work on the statistical modelling of maximum daily rainfalls is presented, with a view to estimating the quantiles for different return periods. An index-flood approach has been adopted, in which the local quantiles result from rescaling a regional law using the mean of each series of values, which is utilized as a local scale factor. The annual maximum series have been taken from 1,545 meteorological stations over a 30-year period, and these have been classified into 26 regions defined according to meteorological criteria, the homogeneity of which has been checked by means of a statistical analysis of the coefficients of variation of the samples. An estimation has been made of the parameters for the following four distribution models: Two-Component Extreme Value (TCEV); General Extreme Value (GEV); Log-Pearson III (LP3); and the SQRT-Exponential Type Distribution of Maximum. The analysis of the quantiles obtained reveals only slight differences among the results, thus reducing the importance of the model selection. The last of the above-mentioned distributions was finally chosen, on the basis of the following: it is defined with fewer parameters; it is the only one proposed specifically for the analysis of daily rainfall maxima; it yields more conservative results than the traditional Gumbel distribution for high return periods; and it is capable of providing a good description of the main sampling statistics concerning the right-hand tail of the distribution, a fact that has been checked with Monte Carlo simulation techniques. The choice of a distribution model with only two parameters has led to the selection of the regional coefficient of variation as the only parameter determining the regional quantiles. This has permitted the elimination of the quantile discontinuity of the classical regional approach, smoothing the values of that coefficient by means of an isoline map on a national scale.
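
    The index-flood rescaling described above can be sketched as follows; the regional growth factors and the local series are illustrative numbers, not values from the study.

```python
def local_quantiles(growth_factors, local_series):
    """Index-flood estimate: the local quantile for return period T is
    the regional growth factor k_T times the local mean, which acts as
    the local scale factor."""
    scale = sum(local_series) / len(local_series)
    return {T: k * scale for T, k in growth_factors.items()}
```

    A single regional growth curve thus yields station-specific quantiles everywhere in a homogeneous region, which is what removes the quantile discontinuity at region boundaries once the scale factor is smoothed spatially.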

  20. Comparative study between the PIXE technique and neutron activation analysis for Zinc determination

    International Nuclear Information System (INIS)

    Cruvinel, Paulo Estevao; Crestana, Silvio; Artaxo Netto, Paulo Eduardo

    1997-01-01

    This work presents a comparative study between the proton-beam PIXE technique and neutron activation analysis (NAA) for the determination of total zinc concentration. In particular, soil samples from the Pindorama experimental station of the Instituto Agronomico de Campinas, Sao Paulo State, Brazil, were analysed, measuring zinc contents in μg/g. The results showed good correlation between the two techniques. The PIXE and NAA analyses were carried out using 2.4 MeV proton beams from the series S Pelletron accelerator and the IPEN/CNEN-IEA-R1 reactor, respectively, both installed at the University of Sao Paulo, Brazil.

  1. A Smoothing Technique for the Multifractal Analysis of a Medium Voltage Feeders Electric Current

    Science.gov (United States)

    de Santis, Enrico; Sadeghian, Alireza; Rizzi, Antonello

    2017-12-01

    The current paper presents a data-driven detrending technique that smooths complex sinusoidal trends from a real-world electric load time series before applying Multifractal Detrended Fluctuation Analysis (MFDFA). The algorithm, which we call Smoothed Sort and Cut Fourier Detrending (SSC-FD), is based on a suitable smoothing of high-power periodicities operating directly in the Fourier spectrum, through a polynomial fitting technique applied to the DFT. The main aim is to disambiguate the characteristic slowly varying periodicities, which can impair the MFDFA analysis, from the residual signal, in order to study its correlation properties. The algorithm's performance is evaluated on a simple benchmark test consisting of a persistent series with known Hurst exponent and ten superimposed sinusoidal harmonics. Moreover, the behavior of the algorithm parameters is assessed by computing the MFDFA on the well-known sunspot data, whose correlation characteristics are reported in the literature. In both cases, the SSC-FD method eliminates the apparent crossover induced by the synthetic and natural periodicities. Results are compared with some existing detrending methods within the MFDFA paradigm. Finally, a study of the multifractal characteristics of the electric load time series detrended by the SSC-FD algorithm is provided, showing strongly persistent behavior and an appreciable width of the multifractal spectrum, which allows one to conclude that the series at hand has multifractal characteristics.
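
    As a crude stand-in for the idea of detrending directly in the Fourier spectrum, the sketch below computes a naive DFT, zeroes the single strongest non-DC harmonic (and its conjugate bin), and inverts the spectrum. The SSC-FD sort-and-cut selection and polynomial smoothing of the DFT are not reproduced here.

```python
import cmath
import math

def dft(x):
    """Naive O(n^2) discrete Fourier transform (for clarity, not speed)."""
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                for t in range(n)) for k in range(n)]

def idft(X):
    """Inverse DFT, returning the real part of each sample."""
    n = len(X)
    return [sum(X[k] * cmath.exp(2j * cmath.pi * k * t / n)
                for k in range(n)).real / n for t in range(n)]

def remove_dominant_harmonic(x):
    """Zero the strongest non-DC bin and its conjugate, then invert."""
    X = dft(x)
    n = len(X)
    k = max(range(1, n // 2 + 1), key=lambda i: abs(X[i]))
    X[k] = 0j
    X[n - k] = 0j  # conjugate bin (the same bin when k == n/2)
    return idft(X)
```

    Removing a dominant sinusoid this way is what prevents the spurious crossover in the subsequent MFDFA fluctuation plots.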

  2. Multifractal detrended cross-correlation analysis on gold, crude oil and foreign exchange rate time series

    Science.gov (United States)

    Pal, Mayukha; Madhusudana Rao, P.; Manimaran, P.

    2014-12-01

    We apply the recently developed multifractal detrended cross-correlation analysis method to investigate the cross-correlation behavior and fractal nature between two non-stationary time series. We analyze the daily return prices of gold, West Texas Intermediate and Brent crude oil, and foreign exchange rate data, over a period of 18 years. The cross-correlation has been measured from the Hurst scaling exponents and the singularity spectrum quantitatively. From the results, the existence of multifractal cross-correlation between all of these time series is found. We also found that the cross-correlation between gold and oil prices possesses uncorrelated behavior and the remaining bivariate time series possess persistent behavior. It was observed for five bivariate series that the cross-correlation exponents are less than the calculated average generalized Hurst exponents (GHE) for q > 0, and for one bivariate series the cross-correlation exponent is greater than the GHE for all q values.

  3. Cerebral venous sinus thrombosis on MRI: A case series analysis

    Directory of Open Access Journals (Sweden)

    Sanjay M Khaladkar

    2014-01-01

    Full Text Available Background: Cerebral venous sinus thrombosis (CVST) is a rare form of stroke seen in the young and middle-aged group, especially in women, due to thrombosis of the dural venous sinuses, and can cause acute neurological deterioration with increased morbidity and mortality if not diagnosed at an early stage. Neurological deficit occurs due to focal or diffuse cerebral edema and venous non-hemorrhagic or hemorrhagic infarct. Aim and Objectives: To evaluate the role of Magnetic Resonance Imaging (MRI) and Magnetic Resonance Venography (MRV) as imaging modalities for early diagnosis of CVST, and to study patterns of venous thrombosis, changes in brain parenchyma, and residual effects of CVST using MRI. Materials and Methods: Retrospective descriptive analysis of 40 patients with CVST diagnosed on MRI brain and MRV was done. Results: 29/40 (72.5%) were males and 11/40 (27.5%) were females. Most of the patients were in the age group of 21-40 years (23/40, 57.5%). Most of the patients, 16/40 (40%), presented within 7 days. No definite cause of CVST was found in 24 (60%) patients in spite of detailed history. In 36/40 (90%) of cases major sinuses were involved, the deep venous system was involved in 7/40 (17.5%) cases, and a superficial cortical vein was involved in 1/40 (2.5%) cases. Analysis of the stage of thrombus (acute, subacute, chronic) was done based on its appearance on T1 and T2WI. 31/40 (77.5%) patients showed complete absence of flow on MRV, while 9/40 (22.5%) cases showed partial flow on MR venogram. Brain parenchyma was normal in 20/40 (50%) patients, while 6/40 (15%) cases had non-hemorrhagic infarct and 14/40 (35%) patients presented with hemorrhagic infarct. Conclusion: Our study concluded that MRI brain with MRV is sensitive in diagnosing both direct signs (evidence of thrombus inside the affected veins) and indirect signs (parenchymal changes) of CVST and their follow-up.

  4. Search for the top quark using multivariate analysis techniques

    International Nuclear Information System (INIS)

    Bhat, P.C.

    1994-08-01

    The D0 collaboration is developing top search strategies using multivariate analysis techniques. We report here on applications of the H-matrix method to the eμ channel and neural networks to the e+jets channel.

  5. Multichannel biomedical time series clustering via hierarchical probabilistic latent semantic analysis.

    Science.gov (United States)

    Wang, Jin; Sun, Xiangping; Nahavandi, Saeid; Kouzani, Abbas; Wu, Yuchuan; She, Mary

    2014-11-01

    Biomedical time series clustering, which automatically groups a collection of time series according to their internal similarity, is of importance for medical record management and inspection, such as bio-signal archiving and retrieval. In this paper, a novel framework that automatically groups a set of unlabelled multichannel biomedical time series according to their internal structural similarity is proposed. Specifically, we treat a multichannel biomedical time series as a document and extract local segments from the time series as words. We extend a topic model, i.e., the Hierarchical probabilistic Latent Semantic Analysis (H-pLSA), originally developed for visual motion analysis, to cluster a set of unlabelled multichannel time series. The H-pLSA models each channel of the multichannel time series using a local pLSA in the first layer. The topics learned in the local pLSA are then fed to a global pLSA in the second layer to discover the categories of multichannel time series. Experiments on a dataset extracted from multichannel Electrocardiography (ECG) signals demonstrate that the proposed method performs better than previous state-of-the-art approaches and is relatively robust to variations of parameters, including the length of local segments and the dictionary size. Although the experimental evaluation used multichannel ECG signals in a biometric scenario, the proposed algorithm is a universal framework for clustering multichannel biomedical time series according to their structural similarity, which has many applications in biomedical time series management. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
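
    The document/word analogy can be sketched as follows: a channel is cut into local segments, each segment is quantized into a symbolic "word", and the word counts form the per-document input a topic model such as pLSA would consume. The mean-based quantization is an illustrative simplification of the dictionary construction.

```python
def extract_words(channel, seg_len, n_levels):
    """Cut one channel into non-overlapping segments and quantize each
    segment's mean into one of n_levels symbols ('words')."""
    lo, hi = min(channel), max(channel)
    span = (hi - lo) or 1.0  # avoid division by zero on flat channels
    words = []
    for i in range(0, len(channel) - seg_len + 1, seg_len):
        seg = channel[i:i + seg_len]
        m = sum(seg) / seg_len
        level = min(int((m - lo) / span * n_levels), n_levels - 1)
        words.append("w%d" % level)
    return words

def word_counts(words):
    """Bag-of-words counts, the per-document input of a pLSA layer."""
    counts = {}
    for w in words:
        counts[w] = counts.get(w, 0) + 1
    return counts
```

    In the full framework, one such bag of words would be built per channel and fed to the local pLSA layer.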

  6. Principal component analysis of MSBAS DInSAR time series from Campi Flegrei, Italy

    Science.gov (United States)

    Tiampo, Kristy F.; González, Pablo J.; Samsonov, Sergey; Fernández, Jose; Camacho, Antonio

    2017-09-01

    Because of its proximity to the city of Naples and with a population of nearly 1 million people within its caldera, Campi Flegrei is one of the highest risk volcanic areas in the world. Since the last major eruption in 1538, the caldera has undergone frequent episodes of ground subsidence and uplift accompanied by seismic activity that has been interpreted as the result of a stationary, deeper source below the caldera that feeds shallower eruptions. However, the location and depth of the deeper source is not well-characterized and its relationship to current activity is poorly understood. Recently, a significant increase in the uplift rate has occurred, resulting in almost 13 cm of uplift by 2013 (De Martino et al., 2014; Samsonov et al., 2014b; Di Vito et al., 2016). Here we apply a principal component decomposition to high resolution time series from the region produced by the advanced Multidimensional SBAS DInSAR technique in order to better delineate both the deeper source and the recent shallow activity. We analyzed both a period of substantial subsidence (1993-1999) and a second of significant uplift (2007-2013) and inverted the associated vertical surface displacement for the most likely source models. Results suggest that the underlying dynamics of the caldera changed in the late 1990s, from one in which the primary signal arises from a shallow deflating source above a deeper, expanding source to one dominated by a shallow inflating source. In general, the shallow source lies between 2700 and 3400 m below the caldera while the deeper source lies at 7600 m or more in depth. The combination of principal component analysis with high resolution MSBAS time series data allows for these new insights and confirms the applicability of both to areas at risk from dynamic natural hazards.
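
    A principal component decomposition of a displacement time series can be sketched as follows, with each row a time sample of vertical displacements at several pixels; the leading eigenvector of the sample covariance matrix is found by power iteration. This is a generic PCA sketch, not the MSBAS processing chain used in the study.

```python
import math

def principal_component(rows, iters=200):
    """Leading eigenvector of the sample covariance of row vectors,
    computed by power iteration on the covariance matrix."""
    n, d = len(rows), len(rows[0])
    means = [sum(r[j] for r in rows) / n for j in range(d)]
    c = [[r[j] - means[j] for j in range(d)] for r in rows]
    cov = [[sum(c[i][a] * c[i][b] for i in range(n)) / (n - 1)
            for b in range(d)] for a in range(d)]
    v = [1.0] * d
    for _ in range(iters):
        w = [sum(cov[a][b] * v[b] for b in range(d)) for a in range(d)]
        norm = math.sqrt(sum(x * x for x in w))
        v = [x / norm for x in w]
    return v
```

    The leading component captures the dominant coherent deformation pattern, which is what one would then invert for a source model.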

  7. Neutron activation analysis: an emerging technique for conservation/preservation

    International Nuclear Information System (INIS)

    Sayre, E.V.

    1976-01-01

    The diverse applications of neutron activation in the analysis, preservation, and documentation of art works and artifacts are described, with illustrations for each application. The uses of this technique to solve problems of attribution and authentication, to reveal the inner structure and composition of art objects, and, in some instances, to recreate details of the objects are described. A brief discussion of the theory and techniques of neutron activation analysis is also included.

  8. Development of evaluation method for software safety analysis techniques

    International Nuclear Information System (INIS)

    Huang, H.; Tu, W.; Shih, C.; Chen, C.; Yang, W.; Yih, S.; Kuo, C.; Chen, M.

    2006-01-01

    Full text: Following the massive adoption of digital Instrumentation and Control (I and C) systems for nuclear power plants (NPP), various Software Safety Analysis (SSA) techniques are used to evaluate NPP safety when adopting a digital I and C system, and thereby to reduce risk to an acceptable level. However, each technique has its specific advantages and disadvantages. If two or more techniques can be incorporated in a complementary way, the resulting SSA combination is more acceptable. Consequently, if proper evaluation criteria are available, the analyst can choose an appropriate combination of techniques on the basis of available resources. This research evaluated the software safety analysis techniques applicable today, such as Preliminary Hazard Analysis (PHA), Failure Modes and Effects Analysis (FMEA), Fault Tree Analysis (FTA), Markov chain modeling, Dynamic Flowgraph Methodology (DFM), and simulation-based model analysis, and then determined indexes reflecting their characteristics: dynamic capability, completeness, achievability, detail, signal/noise ratio, complexity, and implementation cost. These indexes may help decision makers and software safety analysts to choose the best SSA combination and arrange their own software safety plan. With this proposed method, analysts can evaluate various SSA combinations for a specific purpose. According to the case study results, the traditional PHA + FMEA + FTA (with failure rate) + Markov chain modeling (without transfer rate) combination is not competitive, owing to the difficulty of obtaining acceptable software failure rates. However, the systematic architecture of FTA and Markov chain modeling is still valuable for understanding the software fault structure. The system-centric techniques, such as DFM and simulation-based model analysis, show advantages in dynamic capability, achievability, detail, and signal/noise ratio; however, their disadvantages are completeness and complexity.

  9. A new wind speed forecasting strategy based on the chaotic time series modelling technique and the Apriori algorithm

    International Nuclear Information System (INIS)

    Guo, Zhenhai; Chi, Dezhong; Wu, Jie; Zhang, Wenyu

    2014-01-01

    Highlights: • Impact of meteorological factors on wind speed forecasting is taken into account. • Forecasted wind speed results are corrected by the associated rules. • Forecasting accuracy is improved by the new wind speed forecasting strategy. • Robustness of the proposed model is validated with data sampled from different sites. - Abstract: Wind energy has been the fastest growing renewable energy resource in recent years. Because of the intermittent nature of wind, wind power is a fluctuating source of electrical energy. Therefore, to minimize the impact of wind power on the electrical grid, accurate and reliable wind power forecasting is mandatory. In this paper, a new wind speed forecasting approach based on the chaotic time series modelling technique and the Apriori algorithm has been developed. The new approach consists of four procedures: (I) Clustering by using the k-means clustering approach; (II) Employing the Apriori algorithm to discover the association rules; (III) Forecasting the wind speed according to the chaotic time series forecasting model; and (IV) Correcting the forecasted wind speed data using the association rules discovered previously. This procedure has been verified by 31-day-ahead daily average wind speed forecasting case studies, which employed the wind speed and other meteorological data collected from four meteorological stations located in the Hexi Corridor area of China. The results of these case studies reveal that the chaotic forecasting model can efficiently improve the accuracy of the wind speed forecasting, and the Apriori algorithm can effectively discover the association rules between the wind speed and other meteorological factors. In addition, the correction results demonstrate that the association rules discovered by the Apriori algorithm are powerful for correcting the forecasted wind speed values when those values do not match the classification given by the association rules.
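Step (II), discovering association rules between categorised meteorological conditions and wind-speed classes, rests on frequent-itemset mining. The sketch below shows the core support-counting idea using brute-force enumeration rather than Apriori's level-wise candidate pruning; the transactions and condition labels are invented for illustration.

```python
from itertools import combinations

# Invented transactions: each is a set of categorised daily conditions.
transactions = [
    {"temp_low", "pressure_high", "wind_high"},
    {"temp_low", "pressure_high", "wind_high"},
    {"temp_high", "pressure_low", "wind_low"},
    {"temp_low", "pressure_high", "wind_high"},
    {"temp_high", "pressure_low", "wind_low"},
]
min_support = 0.6

def frequent_itemsets(transactions, min_support, max_size=3):
    """Return itemsets whose support (fraction of transactions containing
    them) meets the threshold. Brute-force version of the Apriori idea."""
    n = len(transactions)
    items = sorted({i for t in transactions for i in t})
    frequent = {}
    for size in range(1, max_size + 1):
        for cand in combinations(items, size):
            support = sum(set(cand) <= t for t in transactions) / n
            if support >= min_support:
                frequent[cand] = support
    return frequent

for itemset, support in frequent_itemsets(transactions, min_support).items():
    print(itemset, round(support, 2))
```

From the frequent itemsets, rules such as {temp_low, pressure_high} implies wind_high are then read off by comparing supports; the real Apriori algorithm additionally prunes candidates whose subsets are already infrequent.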

  10. Bioelectric signal classification using a recurrent probabilistic neural network with time-series discriminant component analysis.

    Science.gov (United States)

    Hayashi, Hideaki; Shima, Keisuke; Shibanoki, Taro; Kurita, Yuichi; Tsuji, Toshio

    2013-01-01

    This paper outlines a probabilistic neural network developed on the basis of time-series discriminant component analysis (TSDCA) that can be used to classify high-dimensional time-series patterns. TSDCA involves the compression of high-dimensional time series into a lower-dimensional space using a set of orthogonal transformations and the calculation of posterior probabilities based on a continuous-density hidden Markov model that incorporates a Gaussian mixture model expressed in the reduced-dimensional space. The analysis can be incorporated into a neural network so that parameters can be obtained appropriately as network coefficients using a backpropagation-through-time training algorithm. The network is considered to enable high-accuracy classification of high-dimensional time-series patterns and to reduce the computation time required for network training. In the experiments conducted during the study, the validity of the proposed network was demonstrated for EEG signals.

  11. Research on digital multi-channel pulse height analysis techniques

    International Nuclear Information System (INIS)

    Xiao Wuyun; Wei Yixiang; Ai Xianyun; Ao Qi

    2005-01-01

    Multi-channel pulse height analysis techniques are developing in the direction of digitalization. Based on digital signal processing techniques, digital multi-channel analyzers are characterized by powerful pulse processing ability, high throughput, improved stability and flexibility. This paper analyzes key techniques of digital nuclear pulse processing. With MATLAB software, main algorithms are simulated, such as trapezoidal shaping, digital baseline estimation, digital pole-zero/zero-pole compensation, poles and zeros identification. The preliminary general scheme of digital MCA is discussed, as well as some other important techniques about its engineering design. All these lay the foundation of developing homemade digital nuclear spectrometers. (authors)
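One of the key algorithms named above, trapezoidal shaping, can be written as a short recursive filter: a double difference of the input followed by a running sum turns a step input into a trapezoid whose flat top measures the pulse height. The sketch below is the textbook formulation with the pole-zero correction term omitted; the parameter names are illustrative.

```python
def trapezoidal_shape(x, rise, flat):
    """Trapezoidal shaping of a sampled pulse (pole-zero correction omitted).

    Uses the recursive double-difference form: for a unit step input the
    output ramps over `rise` samples, holds a flat top of `flat` samples
    at height 1.0, then ramps back down to zero.
    """
    k, l = rise, rise + flat
    get = lambda i: x[i] if 0 <= i < len(x) else 0.0  # zero outside the record
    y, acc = [], 0.0
    for n in range(len(x)):
        # double difference of the input, then a running sum
        acc += get(n) - get(n - k) - get(n - l) + get(n - k - l)
        y.append(acc / k)  # normalise the flat top to the step height
    return y

# A unit step at sample 3 produces a trapezoid with flat top 1.0.
shaped = trapezoidal_shape([0.0] * 3 + [1.0] * 20, rise=4, flat=3)
print([round(v, 2) for v in shaped[:12]])
```

In a real digital MCA this filter runs in fixed point on an FPGA and is combined with baseline estimation and pole-zero compensation for exponentially decaying preamplifier pulses.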

  12. Fractal analysis and nonlinear forecasting of indoor 222Rn time series

    International Nuclear Information System (INIS)

    Pausch, G.; Bossew, P.; Hofmann, W.; Steger, F.

    1998-01-01

    Fractal analyses of indoor 222Rn time series were performed using different chaos-theory-based measures such as the time delay method, Hurst's rescaled range analysis, the capacity (fractal) dimension, and the Lyapunov exponent. For all time series we calculated only positive Lyapunov exponents, which is an indication of chaos, while the Hurst exponents were well below 0.5, indicating antipersistent behaviour (past trends tend to reverse in the future). These time series were also analyzed with a nonlinear prediction method which allowed an estimation of the embedding dimensions with some restrictions, limiting the prediction to about three relative time steps. (orig.)
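Hurst's rescaled range analysis, one of the measures listed above, admits a compact implementation: over windows of increasing size, compute the range of the cumulative mean-adjusted deviates, divide by the window's standard deviation, and estimate H as the slope of log(R/S) against log(window size). A minimal sketch on synthetic data (not the radon series):

```python
import math
import random
import statistics

def rescaled_range(series, window):
    """Average R/S statistic over non-overlapping windows of a given size."""
    ratios = []
    for start in range(0, len(series) - window + 1, window):
        w = series[start:start + window]
        mean = statistics.fmean(w)
        deviates, cum = [], 0.0
        for v in w:
            cum += v - mean
            deviates.append(cum)            # cumulative mean-adjusted deviate
        r = max(deviates) - min(deviates)   # range of the cumulative series
        s = statistics.pstdev(w)
        if s > 0:
            ratios.append(r / s)
    return statistics.fmean(ratios)

def hurst_exponent(series, windows=(8, 16, 32, 64, 128)):
    """Slope of log(R/S) vs log(window). H < 0.5 indicates antipersistence."""
    xs = [math.log(n) for n in windows]
    ys = [math.log(rescaled_range(series, n)) for n in windows]
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))

random.seed(42)
noise = [random.gauss(0, 1) for _ in range(4096)]
print(f"H(white noise) ~ {hurst_exponent(noise):.2f}")
```

White noise should give H near 0.5 (small-sample R/S estimates are biased slightly high); an antipersistent series such as the radon data described above would give a value well below 0.5.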

  13. Application of Synthetic Storm Technique for Diurnal and Seasonal Variation of Slant Path Ka-Band Rain Attenuation Time Series over a Subtropical Location in South Africa

    Directory of Open Access Journals (Sweden)

    J. S. Ojo

    2015-01-01

    Full Text Available As technology advances and more demands are on satellite services, rain-induced attenuation still creates one of the most damaging effects of the atmosphere on the quality of radio communication signals, especially those operating above 10 GHz. System designers therefore require statistical information on rain-induced attenuation over the coverage area in order to determine the appropriate transmitter and receiver characteristics to be adopted. This paper presents results on the time-varying rain characterization and diurnal variation of slant path rain attenuation in the Ka-band frequency simulated with synthetic storm techniques over a subtropical location in South Africa using 10-year rain rate time-series data. The analysis is based on the CDF of one-minute rain rate; time-series seasonal variation of rain rate observed over four time intervals: 00:00–06:00, 06:00–12:00, 12:00–18:00, and 18:00–24:00; diurnal fades margin; and diurnal variation of rain attenuation. Comparison was also made between the synthesized values and measured attenuation data. The predicted statistics are in good agreement with those obtained from the propagation beacon measurement in the area. The overall results will be needed for an acceptable planning that can effectively reduce the fade margin to a very low value for an optimum data communication over this area.

  14. Sensitivity analysis and related analysis : A survey of statistical techniques

    NARCIS (Netherlands)

    Kleijnen, J.P.C.

    1995-01-01

    This paper reviews the state of the art in five related types of analysis, namely (i) sensitivity or what-if analysis, (ii) uncertainty or risk analysis, (iii) screening, (iv) validation, and (v) optimization. The main question is: when should which type of analysis be applied; which statistical

  15. Development of environmental sample analysis techniques for safeguards

    International Nuclear Information System (INIS)

    Magara, Masaaki; Hanzawa, Yukiko; Esaka, Fumitaka

    1999-01-01

    JAERI has been developing environmental sample analysis techniques for safeguards and preparing a clean chemistry laboratory with clean rooms. The methods to be developed are bulk analysis and particle analysis. In the bulk analysis, an Inductively Coupled Plasma Mass Spectrometer or a Thermal Ionization Mass Spectrometer is used to measure nuclear materials after chemical treatment of the sample. In the particle analysis, an Electron Probe Micro Analyzer and a Secondary Ion Mass Spectrometer are used for elemental analysis and isotopic analysis, respectively. The design of the clean chemistry laboratory has been carried out and construction will be completed by the end of March, 2001. (author)

  16. Key-space analysis of double random phase encryption technique

    Science.gov (United States)

    Monaghan, David S.; Gopinathan, Unnikrishnan; Naughton, Thomas J.; Sheridan, John T.

    2007-09-01

    We perform a numerical analysis on the double random phase encryption/decryption technique. The key-space of an encryption technique is the set of possible keys that can be used to encode data using that technique. In the case of a strong encryption scheme, many keys must be tried in any brute-force attack on that technique. Traditionally, designers of optical image encryption systems demonstrate only how a small number of arbitrary keys cannot decrypt a chosen encrypted image in their system. However, this type of demonstration does not discuss the properties of the key-space nor refute the feasibility of an efficient brute-force attack. To clarify these issues we present a key-space analysis of the technique. For a range of problem instances we plot the distribution of decryption errors in the key-space indicating the lack of feasibility of a simple brute-force attack.

  17. CROSAT: A digital computer program for statistical-spectral analysis of two discrete time series

    International Nuclear Information System (INIS)

    Antonopoulos Domis, M.

    1978-03-01

    The program CROSAT computes, directly from two discrete time series, auto- and cross-spectra and transfer and coherence functions, using a Fast Fourier Transform subroutine. Statistical analysis of the time series is optional. While of general use, the program is constructed to be immediately compatible with the ICL 4-70 and H316 computers at AEE Winfrith and, perhaps with minor modifications, with any other hardware system. (author)
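The quantities CROSAT computes can be sketched with a modern FFT library. The segment-averaged estimator below is illustrative only, not the original ICL 4-70 code; the function and variable names are invented.

```python
import numpy as np

def cross_spectral(x, y, nseg=16):
    """Welch-style auto-/cross-spectra, transfer and coherence functions."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    seg = len(x) // nseg
    Sxx = Syy = Sxy = 0
    for i in range(nseg):
        # window each segment, transform, and accumulate spectral estimates
        fx = np.fft.rfft(x[i * seg:(i + 1) * seg] * np.hanning(seg))
        fy = np.fft.rfft(y[i * seg:(i + 1) * seg] * np.hanning(seg))
        Sxx += np.abs(fx) ** 2
        Syy += np.abs(fy) ** 2
        Sxy += np.conj(fx) * fy
    transfer = Sxy / Sxx                           # frequency response x -> y
    coherence = np.abs(Sxy) ** 2 / (Sxx * Syy)     # 0..1 per frequency bin
    return transfer, coherence

rng = np.random.default_rng(1)
x = rng.normal(size=4096)
y = 2.0 * x + 0.1 * rng.normal(size=4096)          # y is mostly a scaled x
transfer, coherence = cross_spectral(x, y)
print(f"median |H|: {np.median(np.abs(transfer)):.2f}, "
      f"median coherence: {np.median(coherence):.2f}")
```

With y a gain-2 copy of x plus weak noise, the transfer-function magnitude sits near 2 and the coherence near 1 across frequencies, which is the kind of diagnostic CROSAT produced for reactor noise series.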

  18. Comparison of Eight Techniques for Reconstructing Multi-Satellite Sensor Time-Series NDVI Data Sets in the Heihe River Basin, China

    Directory of Open Access Journals (Sweden)

    Liying Geng

    2014-03-01

    Full Text Available More than 20 techniques have been developed to de-noise time-series vegetation index data from different satellite sensors to reconstruct long time-series data sets. Although many studies have compared Normalized Difference Vegetation Index (NDVI) noise-reduction techniques, few studies have compared these techniques systematically and comprehensively. This study tested eight techniques for smoothing different vegetation types using different types of multi-temporal NDVI data (Advanced Very High Resolution Radiometer (AVHRR) (Global Inventory Modeling and Map Studies (GIMMS) and Pathfinder AVHRR Land (PAL), Satellite Pour l'Observation de la Terre (SPOT) VEGETATION (VGT), and Moderate Resolution Imaging Spectroradiometer (MODIS) (Terra)) with the ultimate purpose of determining the best reconstruction technique for each type of vegetation captured with four satellite sensors. These techniques include the modified best index slope extraction (M-BISE) technique, the Savitzky-Golay (S-G) technique, the mean value iteration filter (MVI) technique, the asymmetric Gaussian (A-G) technique, the double logistic (D-L) technique, the changing-weight filter (CW) technique, the interpolation for data reconstruction (IDR) technique, and the Whittaker smoother (WS) technique. These techniques were evaluated by calculating the root mean square error (RMSE), the Akaike Information Criterion (AIC), and the Bayesian Information Criterion (BIC). The results indicate that the S-G, CW, and WS techniques performed better than the other tested techniques, while the IDR, M-BISE, and MVI techniques performed worse. The best de-noise technique varies with vegetation type and NDVI data source. The S-G performs best in most situations. In addition, the CW and WS are effective techniques that were exceeded only by the S-G technique. The assessment results are consistent in terms of the three evaluation indexes for GIMMS, PAL, and SPOT data in the study
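The Savitzky-Golay (S-G) technique that performed best in the comparison fits a low-order polynomial to a sliding window by least squares and keeps the fitted centre value, which preserves peak shape better than a plain moving average. A minimal sketch on an idealised seasonal curve (invented data, not the NDVI series from the study):

```python
import numpy as np

def savitzky_golay(y, half_window=3, order=2):
    """S-G smoothing: least-squares polynomial fit per window, centre value kept."""
    z = np.arange(-half_window, half_window + 1)
    A = np.vander(z, order + 1, increasing=True)   # columns 1, z, z^2, ...
    coeffs = np.linalg.pinv(A)[0]                  # weights giving the fit at z=0
    # mirror-pad the edges so the output has the same length as the input
    ypad = np.r_[y[half_window:0:-1], y, y[-2:-half_window - 2:-1]]
    return np.convolve(ypad, coeffs[::-1], mode="valid")

t = np.linspace(0, 2 * np.pi, 200)
clean = 0.5 + 0.3 * np.sin(t)                      # idealised seasonal curve
rng = np.random.default_rng(2)
noisy = clean + rng.normal(0, 0.05, t.size)
smooth = savitzky_golay(noisy)
rmse = lambda a, b: float(np.sqrt(np.mean((a - b) ** 2)))
print(f"RMSE noisy: {rmse(noisy, clean):.3f}, smoothed: {rmse(smooth, clean):.3f}")
```

The RMSE against the clean curve drops after smoothing, which is the same criterion (together with AIC and BIC) the study used to rank the eight techniques.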

  19. Nuclear techniques for bulk and surface analysis of materials

    International Nuclear Information System (INIS)

    D'Agostino, M.D.; Kamykowski, E.A.; Kuehne, F.J.; Padawer, G.M.; Schneid, E.J.; Schulte, R.L.; Stauber, M.C.; Swanson, F.R.

    1978-01-01

    A review is presented summarizing several nondestructive bulk and surface analysis nuclear techniques developed in the Grumman Research Laboratories. Bulk analysis techniques include 14-MeV-neutron activation analysis and accelerator-based neutron radiography. The surface analysis techniques include resonant and non-resonant nuclear microprobes for the depth profile analysis of light elements (H, He, Li, Be, C, N, O and F) in the surface of materials. Emphasis is placed on the description and discussion of the unique nuclear microprobe analytical capabilities of immediate importance to a number of current problems facing materials specialists. The resolution and contrast of neutron radiography were illustrated with an operating heat pipe system. The figure shows that the neutron radiograph has a resolution of better than 0.04 cm, with sufficient contrast to indicate Freon 21 on the inner capillaries of the heat pipe and pooling of the liquid at the bottom. (T.G.)

  20. A Comparison of Pseudo-Maximum Likelihood and Asymptotically Distribution-Free Dynamic Factor Analysis Parameter Estimation in Fitting Covariance-Structure Models to Block-Toeplitz Representing Single-Subject Multivariate Time-Series

    NARCIS (Netherlands)

    Molenaar, P.C.M.; Nesselroade, J.R.

    1998-01-01

    The study of intraindividual variability pervades empirical inquiry in virtually all subdisciplines of psychology. The statistical analysis of multivariate time-series data - a central product of intraindividual investigations - requires special modeling techniques. The dynamic factor model (DFM),

  1. Romanian medieval earring analysis by X-ray fluorescence technique

    International Nuclear Information System (INIS)

    Therese, Laurent; Guillot, Philippe; Muja, Cristina

    2011-01-01

    Full text: Several instrumental techniques of elemental analysis are now used for the characterization of archaeological materials. The combination of archaeological and analytical information can provide significant knowledge on the origin of the constituting material, heritage authentication and restoration, provenance, migration, social interaction and exchange. Surface mapping techniques such as X-Ray Fluorescence have become a powerful tool for obtaining qualitative and semi-quantitative information about the chemical composition of cultural heritage materials, including metallic archaeological objects. In this study, the material comes from the Middle Age cemetery of Feldioara (Romania). The excavation of the site, located between the evangelical church and the parsonage, led to the discovery of several funeral artifacts in 18 graves among a total of 127 excavated. Even if the inventory was quite poor, some of the objects helped in establishing the chronology. Six anonymous Hungarian denarii (silver coins) were attributed to Geza II (1141-1161) and Stefan III (1162-1172), placing the cemetery in the second half of the XII century. This period was also confirmed by three loop-shaped earrings with the end in 'S' form (one small and two large earrings). The small earring was found during the excavation in grave number 86, while the two others were discovered together in grave number 113. The anthropological study showed that the skeletons excavated from graves 86 and 113 belonged respectively to a child (1 individual, medium level of preservation, 9 months +/- 3 months) and to an adult (1 individual). In this work, elemental maps were obtained by the X-ray fluorescence (XRF) technique with a Jobin Yvon Horiba XGT-5000 instrument offering detailed elemental images with a spatial resolution of 100 μm. The analysis revealed that the earrings were composed of copper, zinc and tin as major elements. Minor elements were also determined.
The comparison between the two large earrings

  2. Romanian medieval earring analysis by X-ray fluorescence technique

    Energy Technology Data Exchange (ETDEWEB)

    Therese, Laurent; Guillot, Philippe, E-mail: philippe.guillot@univ-jfc.fr [Laboratoire Diagnostics des Plasmas, CUFR J.F.C, Albi (France); Muja, Cristina [Laboratoire Diagnostics des Plasmas, CUFR J.F.C, Albi (France); Faculty of Biology, University of Bucharest (Romania); Vasile Parvan Institute of Archaeology, Bucharest, (Romania)

    2011-07-01

    Full text: Several instrumental techniques of elemental analysis are now used for the characterization of archaeological materials. The combination of archaeological and analytical information can provide significant knowledge on the origin of the constituting material, heritage authentication and restoration, provenance, migration, social interaction and exchange. Surface mapping techniques such as X-Ray Fluorescence have become a powerful tool for obtaining qualitative and semi-quantitative information about the chemical composition of cultural heritage materials, including metallic archaeological objects. In this study, the material comes from the Middle Age cemetery of Feldioara (Romania). The excavation of the site, located between the evangelical church and the parsonage, led to the discovery of several funeral artifacts in 18 graves among a total of 127 excavated. Even if the inventory was quite poor, some of the objects helped in establishing the chronology. Six anonymous Hungarian denarii (silver coins) were attributed to Geza II (1141-1161) and Stefan III (1162-1172), placing the cemetery in the second half of the XII century. This period was also confirmed by three loop-shaped earrings with the end in 'S' form (one small and two large earrings). The small earring was found during the excavation in grave number 86, while the two others were discovered together in grave number 113. The anthropological study showed that the skeletons excavated from graves 86 and 113 belonged respectively to a child (1 individual, medium level of preservation, 9 months +/- 3 months) and to an adult (1 individual). In this work, elemental maps were obtained by the X-ray fluorescence (XRF) technique with a Jobin Yvon Horiba XGT-5000 instrument offering detailed elemental images with a spatial resolution of 100 μm. The analysis revealed that the earrings were composed of copper, zinc and tin as major elements. Minor elements were also determined. The comparison between the two

  3. An Interactive Analysis of Hyperboles in a British TV Series: Implications For EFL Classes

    Science.gov (United States)

    Sert, Olcay

    2008-01-01

    This paper, part of an ongoing study on the analysis of hyperboles in a British TV series, reports findings drawing upon a 90,000 word corpus. The findings are compared to the ones from CANCODE (McCarthy and Carter 2004), a five-million word corpus of spontaneous speech, in order to identify similarities between the two. The analysis showed that…

  4. Mapping Mountain Pine Beetle Mortality through Growth Trend Analysis of Time-Series Landsat Data

    Directory of Open Access Journals (Sweden)

    Lu Liang

    2014-06-01

    Full Text Available Disturbances are key processes in the carbon cycle of forests and other ecosystems. In recent decades, mountain pine beetle (MPB; Dendroctonus ponderosae) outbreaks have become more frequent and extensive in western North America. Remote sensing has the ability to fill the data gaps of long-term infestation monitoring, but the elimination of observational noise and attributing changes quantitatively are two main challenges in its effective application. Here, we present a forest growth trend analysis method that integrates Landsat temporal trajectories and decision tree techniques to derive annual forest disturbance maps over an 11-year period. The temporal trajectory component successfully captures the disturbance events as represented by spectral segments, whereas decision tree modeling efficiently recognizes and attributes events based upon the characteristics of the segments. Validated against a point set sampled across a gradient of MPB mortality, the approach achieved 86.74% to 94.00% overall accuracy, with small variability in accuracy among years. In contrast, the overall accuracies of single-date classifications ranged from 37.20% to 75.20% and only became comparable with our approach when the training sample size was increased at least four-fold. This demonstrates that a key advantage of this time-series workflow is its small training-sample requirement. The easily understandable, interpretable and modifiable characteristics of our approach suggest that it could be applicable to other ecoregions.

  5. Absolute high-resolution Se+ photoionization cross-section measurements with Rydberg-series analysis

    International Nuclear Information System (INIS)

    Esteves, D. A.; Bilodeau, R. C.; Sterling, N. C.; Phaneuf, R. A.; Kilcoyne, A. L. D.; Red, E. C.; Aguilar, A.

    2011-01-01

    Absolute single-photoionization cross-section measurements for Se⁺ ions were performed at the Advanced Light Source (ALS) at Lawrence Berkeley National Laboratory using the photo-ion merged-beams technique. Measurements were made at a photon energy resolution of 5.5 meV from 17.75 to 21.85 eV, spanning the 4s²4p³ ⁴S°₃/₂ ground-state ionization threshold and the ²P°₃/₂, ²P°₁/₂, ²D°₅/₂, and ²D°₃/₂ metastable-state thresholds. Extensive analysis of the complex resonant structure in this region identified numerous Rydberg series of resonances and yielded the Se²⁺ 4s²4p² ³P₂ and 4s²4p² ¹S₀ state energies. In addition, particular attention was given to removing significant effects in the measurements due to a small percentage of higher-order undulator radiation.

  6. The effect of the series resistance in dye-sensitized solar cells explored by electron transport and back reaction using electrical and optical modulation techniques

    International Nuclear Information System (INIS)

    Liu Weiqing; Hu Linhua; Dai Songyuan; Guo Lei; Jiang Nianquan; Kou Dongxing

    2010-01-01

    The influence of the series resistance on the electron transport and recombination processes in dye-sensitized solar cells (DSC) has been investigated. The series resistances induced by some parts of DSC, such as the transparent conductive oxide (TCO), the electrolyte layer and the counter electrode, influence the performance of DSC. By combining three frequency-domain techniques, specifically electrochemical impedance spectroscopy (EIS), intensity modulated photocurrent spectroscopy (IMPS) and intensity modulated photovoltage spectroscopy (IMVS), we studied the relationship between the series resistance and the dynamic response of DSC. The results show that the series resistance induced by the TCO or counter electrode predominantly affects the electron transport under short circuit conditions and has no significant influence on the recombination under open circuit conditions. However, the resistance related to the electrolyte layer not only limits the carrier transport but also influences the recombination. Possible reasons for the influence of the series resistance on the electron transport and recombination processes in DSC are also discussed.

  7. Meta-analysis in a nutshell: Techniques and general findings

    DEFF Research Database (Denmark)

    Paldam, Martin

    2015-01-01

    The purpose of this article is to introduce the technique and main findings of meta-analysis to the reader, who is unfamiliar with the field and has the usual objections. A meta-analysis is a quantitative survey of a literature reporting estimates of the same parameter. The funnel showing...

  8. 48 CFR 15.404-1 - Proposal analysis techniques.

    Science.gov (United States)

    2010-10-01

    ... 48 Federal Acquisition Regulations System 1 2010-10-01 2010-10-01 false Proposal analysis techniques. 15.404-1 Section 15.404-1 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION... assistance of other experts to ensure that an appropriate analysis is performed. (6) Recommendations or...

  9. NMR and modelling techniques in structural and conformation analysis

    Energy Technology Data Exchange (ETDEWEB)

    Abraham, R J [Liverpool Univ. (United Kingdom)

    1994-12-31

    The use of Lanthanide Induced Shifts (L.I.S.) and modelling techniques in conformational analysis is presented. The use of Co(III) porphyrins as shift reagents is discussed, with examples of their use in the conformational analysis of some heterocyclic amines. (author) 13 refs., 9 figs.

  10. Application of nuclear analysis techniques in ancient chinese porcelain

    International Nuclear Information System (INIS)

    Feng Songlin; Xu Qing; Feng Xiangqian; Lei Yong; Cheng Lin; Wang Yanqing

    2005-01-01

    Ancient ceramics were fired from porcelain clay and therefore carry information about provenance and age. Analyzing ancient ceramics with modern analytical methods provides the scientific foundation for the study of Chinese porcelain. The functions and applications of nuclear analysis techniques are discussed according to their properties. (authors)

  11. SWOT ANALYSIS-MANAGEMENT TECHNIQUES TO STREAMLINE PUBLIC BUSINESS MANAGEMENT

    OpenAIRE

    Rodica IVORSCHI

    2012-01-01

    SWOT analysis is one of the most important management techniques for understanding the strategic position of an organization. The objective of SWOT analysis is to recommend strategies that ensure the best alignment between the internal and external environment; choosing the right strategy can benefit the organization by adapting its strengths to opportunities, minimizing risks and eliminating weaknesses.

  12. SWOT ANALYSIS-MANAGEMENT TECHNIQUES TO STREAMLINE PUBLIC BUSINESS MANAGEMENT

    Directory of Open Access Journals (Sweden)

    Rodica IVORSCHI

    2012-06-01

    Full Text Available SWOT analysis is one of the most important management techniques for understanding the strategic position of an organization. The objective of SWOT analysis is to recommend strategies that ensure the best alignment between the internal and external environment; choosing the right strategy can benefit the organization by adapting its strengths to opportunities, minimizing risks and eliminating weaknesses.

  13. Kinematics analysis technique fouettes 720° classic ballet.

    Directory of Open Access Journals (Sweden)

    Li Bo

    2011-07-01

    Full Text Available Athletic practice has proved that the more complex the element, the more difficult the technique of the exercise. The fouetté at 720° is one of the most difficult types of fouetté, and its execution demands a high level of technique throughout the performer's rotation. To perform this element the dancer requires not only good physical condition but also mastery of the correct technique. On the basis of the corresponding kinematic theory, this study presents a qualitative analysis and quantitative assessment of the fouetté at 720° as performed by leading Chinese dancers. The analysis was based on stereoscopic imaging and theoretical analysis.

  14. Original technique for penile girth augmentation through porcine dermal acellular grafts: results in a 69-patient series.

    Science.gov (United States)

    Alei, Giovanni; Letizia, Piero; Ricottilli, Francesco; Simone, Pierfranco; Alei, Lavinia; Massoni, Francesco; Ricci, Serafino

    2012-07-01

    Although different techniques for augmentation phalloplasty have been reported in the medical literature, this issue is still highly controversial, and none of the proposed procedures has been unanimously approved. The aim of this study is to describe an innovative surgical technique for penile girth augmentation with porcine dermal acellular grafts, through a small transverse incision at the penile base, along the penopubic junction. Between 2000 and 2009, 104 patients were referred to our institution for penile enhancement. After a preoperative psychosexual consultation and a general medical assessment, 69 patients were deemed good candidates for surgery. The average penis circumference, measured at the mid-length of the penis, was 8.1 cm (5.4-10.7 cm) during flaccidity and 10.8 cm (6.5-15.8 cm) during erection. All patients received penile augmentation with porcine dermal acellular grafts. The outcome measure was the evaluation of an innovative technique for penile girth augmentation using exogenous porcine grafts through a small penopubic incision. Postoperative measurements were performed at 6 and 12 months. At the 1-year follow-up, the average penis circumference was 11.3 cm (8.2-13.2 cm, 3.1 cm mean increase) during flaccidity and 13.2 cm (8.8-14.5 cm, 2.4 cm mean increase) during erection. No major complications occurred in the series. Minor complications were resolved with conservative treatment within 3 weeks. Sexual activity was resumed from 1 to 2 months after surgery. The psychosexual impact of the operation was beneficial in the majority of cases. Penile girth enlargement with acellular dermal matrix grafts has several advantages over augmentation with autogenous dermis-fat grafts: the elimination of donor site morbidity and a significantly shorter operation time. With this approach, through a short dorsal incision at the base of the penis, the scar is concealed in a crease covered by pubic hair and thus hardly visible. © 2012

  15. A Markovian Entropy Measure for the Analysis of Calcium Activity Time Series.

    Science.gov (United States)

    Marken, John P; Halleran, Andrew D; Rahman, Atiqur; Odorizzi, Laura; LeFew, Michael C; Golino, Caroline A; Kemper, Peter; Saha, Margaret S

    2016-01-01

    Methods to analyze the dynamics of calcium activity often rely on visually distinguishable features in time series data such as spikes, waves, or oscillations. However, systems such as the developing nervous system display a complex, irregular type of calcium activity which makes the use of such methods less appropriate. Instead, for such systems there exists a class of methods (including information theoretic, power spectral, and fractal analysis approaches) which use more fundamental properties of the time series to analyze the observed calcium dynamics. We present a new analysis method in this class, the Markovian Entropy measure, an easily implementable calcium time series analysis method which represents the observed calcium activity as a realization of a Markov process and describes its dynamics in terms of the level of predictability underlying the transitions between the states of the process. We applied our method and other commonly used calcium analysis methods to a dataset from Xenopus laevis neural progenitors, which displays irregular calcium activity, and a dataset from murine synaptic neurons, which displays activity time series that are well described by visually distinguishable features. We find that the Markovian Entropy measure is able to distinguish between biologically distinct populations in both datasets, and that it can separate biologically distinct populations to a greater extent than other methods in the dataset exhibiting irregular calcium activity. These results support the benefit of using the Markovian Entropy measure to analyze calcium dynamics, particularly for studies using time series data which do not exhibit easily distinguishable features.
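
    The core idea of such a measure can be sketched in a few lines. This is not the authors' implementation; the equiprobable binning, the state count, and the function name are our assumptions for illustration:

```python
import numpy as np

def markovian_entropy(series, n_states=4):
    """Entropy rate of a Markov chain fitted to a discretized time series.

    The series is binned into n_states equiprobable states, transition
    probabilities are estimated by counting consecutive pairs, and the
    result is the occupancy-weighted average of per-state transition
    entropies (low = predictable dynamics, high = irregular dynamics).
    """
    # Quantile bin edges give equiprobable states
    edges = np.quantile(series, np.linspace(0, 1, n_states + 1)[1:-1])
    states = np.digitize(series, edges)

    # Count transitions between consecutive states
    counts = np.zeros((n_states, n_states))
    for a, b in zip(states[:-1], states[1:]):
        counts[a, b] += 1

    occupancy = counts.sum(axis=1)
    total = counts.sum()
    entropy = 0.0
    for i in range(n_states):
        if occupancy[i] == 0:
            continue
        p = counts[i] / occupancy[i]
        p = p[p > 0]
        h_i = -np.sum(p * np.log2(p))          # entropy of row i's transitions
        entropy += (occupancy[i] / total) * h_i
    return entropy

rng = np.random.default_rng(0)
predictable = np.sin(np.linspace(0, 40 * np.pi, 2000))   # regular oscillation
irregular = rng.normal(size=2000)                        # white noise
```

    A regular oscillation yields a much lower transition entropy than white noise, which is the separation the measure exploits.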

  16. A Markovian Entropy Measure for the Analysis of Calcium Activity Time Series.

    Directory of Open Access Journals (Sweden)

    John P Marken

    Full Text Available Methods to analyze the dynamics of calcium activity often rely on visually distinguishable features in time series data such as spikes, waves, or oscillations. However, systems such as the developing nervous system display a complex, irregular type of calcium activity which makes the use of such methods less appropriate. Instead, for such systems there exists a class of methods (including information theoretic, power spectral, and fractal analysis approaches) which use more fundamental properties of the time series to analyze the observed calcium dynamics. We present a new analysis method in this class, the Markovian Entropy measure, an easily implementable calcium time series analysis method which represents the observed calcium activity as a realization of a Markov process and describes its dynamics in terms of the level of predictability underlying the transitions between the states of the process. We applied our method and other commonly used calcium analysis methods to a dataset from Xenopus laevis neural progenitors, which displays irregular calcium activity, and a dataset from murine synaptic neurons, which displays activity time series that are well described by visually distinguishable features. We find that the Markovian Entropy measure is able to distinguish between biologically distinct populations in both datasets, and that it can separate biologically distinct populations to a greater extent than other methods in the dataset exhibiting irregular calcium activity. These results support the benefit of using the Markovian Entropy measure to analyze calcium dynamics, particularly for studies using time series data which do not exhibit easily distinguishable features.

  17. PREFACE: 14th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2011)

    Science.gov (United States)

    Teodorescu, Liliana; Britton, David; Glover, Nigel; Heinrich, Gudrun; Lauret, Jérôme; Naumann, Axel; Speer, Thomas; Teixeira-Dias, Pedro

    2012-06-01

    This volume of Journal of Physics: Conference Series is dedicated to scientific contributions presented at the 14th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2011), which took place on 5-7 September 2011 at Brunel University, UK. The workshop series, which began in 1990 in Lyon, France, brings together computer science researchers and practitioners, and researchers from particle physics and related fields, in order to explore and confront the boundaries of computing and of automatic data analysis and theoretical calculation techniques. It is a forum for the exchange of ideas among the fields, exploring and promoting cutting-edge computing, data analysis and theoretical calculation techniques in fundamental physics research. This year's edition of the workshop brought together over 100 participants from all over the world. 14 invited speakers presented key topics on computing ecosystems, cloud computing, multivariate data analysis, symbolic and automatic theoretical calculations as well as computing and data analysis challenges in astrophysics, bioinformatics and musicology. Over 80 other talks and posters presented state-of-the-art developments in the areas of the workshop's three tracks: Computing Technologies, Data Analysis Algorithms and Tools, and Computational Techniques in Theoretical Physics. Panel and round table discussions on data management and multivariate data analysis uncovered new ideas and collaboration opportunities in the respective areas. This edition of ACAT was generously sponsored by the Science and Technology Facility Council (STFC), the Institute for Particle Physics Phenomenology (IPPP) at Durham University, Brookhaven National Laboratory in the USA and Dell. We would like to thank all the participants of the workshop for the high level of their scientific contributions and for the enthusiastic participation in all its activities which were, ultimately, the key factors in the

  18. Time-series analysis to study the impact of an intersection on dispersion along a street canyon.

    Science.gov (United States)

    Richmond-Bryant, Jennifer; Eisner, Alfred D; Hahn, Intaek; Fortune, Christopher R; Drake-Richman, Zora E; Brixey, Laurie A; Talih, M; Wiener, Russell W; Ellenson, William D

    2009-12-01

    This paper presents data analysis from the Brooklyn Traffic Real-Time Ambient Pollutant Penetration and Environmental Dispersion (B-TRAPPED) study to assess the transport of ultrafine particulate matter (PM) across urban intersections. Experiments were performed in a street canyon perpendicular to a highway in Brooklyn, NY, USA. Real-time ultrafine PM samplers were positioned on either side of an intersection at multiple locations along a street to collect time-series number concentration data. Meteorology equipment was positioned within the street canyon and at an upstream background site to measure wind speed and direction. Time-series analysis was performed on the PM data to compute a transport velocity along the direction of the street for the cases where background winds were parallel and perpendicular to the street. The data were analyzed for sampler pairs located (1) on opposite sides of the intersection and (2) on the same block. The time-series analysis demonstrated along-street transport, including across the intersection, when background winds were parallel to the street canyon, and minimal transport with no communication across the intersection when background winds were perpendicular to the street canyon. Low but significant values of the cross-correlation function (CCF) underscore the turbulent nature of plume transport along the street canyon. The low correlations suggest that flow switching around corners or traffic-induced turbulence at the intersection may have aided dilution of the PM plume from the highway. This observation supports similar findings in the literature. Furthermore, the time-series analysis methodology applied in this study is introduced as a technique for studying spatiotemporal variation in the urban microscale environment.
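
    The lagged cross-correlation step behind such a transport-velocity estimate can be sketched on synthetic data. The 1 s sampling interval, the 50 m sampler separation, and the signal model below are assumptions for illustration, not values from the study:

```python
import numpy as np

def transport_lag(upstream, downstream, dt):
    """Lag (in time units) at which the downstream series best correlates
    with the upstream one. A positive lag means the downstream signal
    echoes the upstream one later, consistent with along-street transport."""
    u = (upstream - upstream.mean()) / upstream.std()
    d = (downstream - downstream.mean()) / downstream.std()
    ccf = np.correlate(d, u, mode="full") / len(u)
    lags = np.arange(-len(u) + 1, len(u))
    return lags[np.argmax(ccf)] * dt, ccf.max()

rng = np.random.default_rng(1)
# Band-limited noise as a stand-in for a fluctuating concentration signal
sig = np.convolve(rng.normal(size=700), np.ones(5) / 5, mode="same")
shift = 30                                       # 30-sample travel time
near = sig[shift:shift + 600]                    # sampler near the highway
far = sig[:600] + 0.05 * rng.normal(size=600)    # sampler down the street

lag, peak = transport_lag(near, far, dt=1.0)     # dt = 1 s sampling (assumed)
velocity = 50.0 / lag                            # m/s for 50 m spacing (assumed)
```

    The peak of the CCF recovers the imposed 30 s travel time; dividing the sampler separation by that lag gives the along-street transport velocity.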

  19. ESTIMATING RELIABILITY OF DISTURBANCES IN SATELLITE TIME SERIES DATA BASED ON STATISTICAL ANALYSIS

    Directory of Open Access Journals (Sweden)

    Z.-G. Zhou

    2016-06-01

    Full Text Available Normally, the status of land cover is inherently dynamic and changes continuously on a temporal scale. However, disturbances or abnormal changes of land cover (caused, for example, by forest fire, flood, deforestation, or plant diseases) occur worldwide at unknown times and locations. Timely detection and characterization of these disturbances is of importance for land cover monitoring. Recently, many time-series-analysis methods have been developed for near real-time or online disturbance detection using satellite image time series. However, most present methods only label the detection results as "change/no change", while few methods focus on estimating the reliability (or confidence level) of the detected disturbances in image time series. To this end, this paper proposes a statistical analysis method for estimating the reliability of disturbances in newly available remote sensing image time series, through analysis of the full temporal information contained in the time series data. The method consists of three main steps: (1) segmenting and modelling historical time series data based on Breaks for Additive Seasonal and Trend (BFAST); (2) forecasting and detecting disturbances in new time series data; (3) estimating the reliability of each detected disturbance using statistical analysis based on the Confidence Interval (CI) and Confidence Level (CL). The method was validated by estimating the reliability of disturbance regions caused by a recent severe flood that occurred around the border of Russia and China. Results demonstrated that the method can estimate the reliability of disturbances detected in satellite image time series with an estimation error of less than 5% and an overall accuracy of up to 90%.
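
    The forecast-and-flag logic of steps (2) and (3) can be illustrated with a much simpler season-trend model than BFAST; the harmonic-plus-trend fit, the synthetic data, and the z-score reliability proxy below are our assumptions, not the paper's method:

```python
import numpy as np

def fit_season_trend(t, y, period=12.0):
    """Least-squares fit of a linear trend plus one seasonal harmonic
    (a rough stand-in for a BFAST-style season-trend model)."""
    X = np.column_stack([np.ones_like(t), t,
                         np.sin(2 * np.pi * t / period),
                         np.cos(2 * np.pi * t / period)])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta, y - X @ beta

def disturbance_confidence(t_hist, y_hist, t_new, y_new, period=12.0):
    """z-scores of new observations against the historical forecast.

    |z| > 1.96 falls outside the ~95% confidence interval of the forecast,
    so larger |z| means higher confidence that a detected disturbance is
    a real change rather than noise."""
    beta, resid = fit_season_trend(t_hist, y_hist, period)
    sigma = resid.std(ddof=4)                     # 4 fitted parameters
    Xn = np.column_stack([np.ones_like(t_new), t_new,
                          np.sin(2 * np.pi * t_new / period),
                          np.cos(2 * np.pi * t_new / period)])
    return (y_new - Xn @ beta) / sigma

rng = np.random.default_rng(2)
t = np.arange(120.0)                              # 10 years of monthly history
y = 0.01 * t + np.sin(2 * np.pi * t / 12) + 0.1 * rng.normal(size=120)
t_new = np.arange(120.0, 126.0)
y_new = 0.01 * t_new + np.sin(2 * np.pi * t_new / 12) + 0.1 * rng.normal(size=6)
y_new[3] -= 2.0                                   # inject a flood-like drop

z = disturbance_confidence(t, y, t_new, y_new)
flags = np.abs(z) > 1.96                          # disturbances at ~95% confidence
```

    The injected drop is many residual standard deviations below the forecast, so it is flagged with high confidence while the undisturbed observations are not.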

  20. Trend analysis and change point detection of annual and seasonal temperature series in Peninsular Malaysia

    Science.gov (United States)

    Suhaila, Jamaludin; Yusop, Zulkifli

    2017-06-01

    Most trend analyses that have been conducted have not considered the existence of a change point in the time series. If a change point exists, the trend analysis will not be able to detect an obvious increasing or decreasing trend over certain parts of the time series. Furthermore, the lack of discussion of the possible factors that influenced either the decreasing or the increasing trend in the series needs to be addressed in any trend analysis. Hence, this study investigates the trends and change points of mean, maximum and minimum temperature series, both annually and seasonally, in Peninsular Malaysia and determines the possible factors that could contribute to the significant trends. In this study, Pettitt and sequential Mann-Kendall (SQ-MK) tests were used to examine the occurrence of any abrupt climate changes in the independent series. The analyses of the abrupt changes in temperature series suggested that most of the change points in Peninsular Malaysia were detected during the years 1996, 1997 and 1998. These change points captured by the Pettitt and SQ-MK tests are possibly related to climatic factors, such as El Niño and La Niña events. The findings also showed that the majority of the significant change points that exist in the series are related to significant trends at the stations. Significant increasing trends of annual and seasonal mean, maximum and minimum temperatures in Peninsular Malaysia were found, with a range of 2-5 °C/100 years during the last 32 years. It was observed that the magnitudes of the increasing trends in minimum temperatures were larger than those in maximum temperatures for most of the studied stations, particularly the urban stations. These increases are suspected to be linked with the urban heat island effect in addition to El Niño events.
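
    The Pettitt test named above is compact enough to sketch directly; the synthetic temperature series and the injected shift year are our illustrative assumptions:

```python
import numpy as np

def pettitt_test(x):
    """Pettitt's nonparametric test for a single change point.

    Returns (t_hat, K, p): the index of the most probable change point
    (the change occurs after index t_hat), the statistic K = max_t |U_t|,
    and the approximate p-value p ~ 2 exp(-6 K^2 / (n^3 + n^2)).
    """
    x = np.asarray(x, dtype=float)
    n = len(x)
    U = np.empty(n)
    u = 0.0
    for t in range(n):
        # Recursive update: U_t = U_{t-1} + sum_j sign(x_t - x_j)
        u += np.sign(x[t] - x).sum()
        U[t] = u
    t_hat = int(np.argmax(np.abs(U[:-1])))
    K = np.abs(U[:-1]).max()
    p = 2.0 * np.exp(-6.0 * K**2 / (n**3 + n**2))
    return t_hat, K, min(p, 1.0)

rng = np.random.default_rng(3)
# Synthetic annual mean temperatures with an abrupt +1 degree shift at year 60
temps = np.concatenate([rng.normal(26.0, 0.3, 60), rng.normal(27.0, 0.3, 40)])
t_hat, K, p = pettitt_test(temps)
```

    The test locates the injected change point and returns a p-value far below any conventional significance level, which is how abrupt shifts such as those around 1996-1998 are detected.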

  1. Phase synchronization based minimum spanning trees for analysis of financial time series with nonlinear correlations

    Science.gov (United States)

    Radhakrishnan, Srinivasan; Duvvuru, Arjun; Sultornsanee, Sivarit; Kamarthi, Sagar

    2016-02-01

    The cross correlation coefficient has been widely applied in financial time series analysis, in particular for understanding chaotic behaviour in terms of stock price and index movements during crisis periods. To better understand time series correlation dynamics, the cross correlation matrices are represented as networks, in which a node stands for an individual time series and a link indicates cross correlation between a pair of nodes. These networks are converted into simpler trees using different schemes. In this context, Minimum Spanning Trees (MST) are the most favoured tree structures because of their ability to preserve all the nodes and thereby retain essential information embedded in the network. Although cross correlations underlying MSTs capture essential information, they do not faithfully capture dynamic behaviour embedded in the time series data of financial systems, because cross correlation is a reliable measure only if the relationship between the time series is linear. To address the issue, this work investigates a new measure called phase synchronization (PS) for establishing correlations among different time series which relate to one another linearly or nonlinearly. In this approach the strength of a link between a pair of time series (nodes) is determined by the level of phase synchronization between them. We compare the performance of phase synchronization based MSTs with cross correlation based MSTs along selected network measures across temporal frames that include economically good and crisis periods. We observe agreement in the directionality of the results across these two methods. They show similar trends, upward or downward, when comparing selected network measures.
Though both the methods give similar trends, the phase synchronization based MST is a more reliable representation of the dynamic behaviour of financial systems than the cross correlation based MST because of the former's ability to quantify nonlinear relationships among time
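
    The PS-based MST pipeline can be sketched end to end. This is not the authors' code: the FFT-based Hilbert transform, the phase locking value as the synchronization measure, the distance conversion, and the toy three-series "market" are all our assumed choices:

```python
import numpy as np

def instantaneous_phase(x):
    """Phase of the analytic signal via an FFT-based Hilbert transform."""
    n = len(x)
    X = np.fft.fft(x - x.mean())
    h = np.zeros(n)
    h[0] = 1.0
    h[1:(n + 1) // 2] = 2.0
    if n % 2 == 0:
        h[n // 2] = 1.0
    return np.angle(np.fft.ifft(X * h))

def phase_locking(x, y):
    """Phase locking value in [0, 1]; 1 = perfect phase synchronization."""
    dphi = instantaneous_phase(x) - instantaneous_phase(y)
    return np.abs(np.exp(1j * dphi).mean())

def mst_edges(dist):
    """Prim's algorithm on a dense distance matrix; returns the MST edges."""
    n = dist.shape[0]
    in_tree = [0]
    edges = []
    while len(in_tree) < n:
        best = None
        for i in in_tree:
            for j in range(n):
                if j not in in_tree and (best is None or dist[i, j] < best[2]):
                    best = (i, j, dist[i, j])
        edges.append(best[:2])
        in_tree.append(best[1])
    return edges

rng = np.random.default_rng(4)
t = np.linspace(0, 20 * np.pi, 2000)
# Three "assets": two phase-locked oscillators and one unrelated noise series
series = [np.sin(t) + 0.1 * rng.normal(size=t.size),
          np.sin(t + 0.5) + 0.1 * rng.normal(size=t.size),
          rng.normal(size=t.size)]

n = len(series)
plv = np.ones((n, n))
for i in range(n):
    for j in range(i + 1, n):
        plv[i, j] = plv[j, i] = phase_locking(series[i], series[j])
dist = np.sqrt(2.0 * (1.0 - plv))        # convert similarity to a distance
edges = mst_edges(dist)
```

    The two phase-locked series end up joined by the first MST edge, while the noise series attaches at a much larger distance, which is the structure the PS-based MST is designed to expose.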

  2. Statistical performance and information content of time lag analysis and redundancy analysis in time series modeling.

    Science.gov (United States)

    Angeler, David G; Viedma, Olga; Moreno, José M

    2009-11-01

    Time lag analysis (TLA) is a distance-based approach used to study temporal dynamics of ecological communities by measuring community dissimilarity over increasing time lags. Despite its increased use in recent years, its performance in comparison with other more direct methods (i.e., canonical ordination) has not been evaluated. This study fills this gap using extensive simulations and real data sets from experimental temporary ponds (true zooplankton communities) and landscape studies (landscape categories as pseudo-communities) that differ in community structure and anthropogenic stress history. Modeling time with a principal coordinate of neighborhood matrices (PCNM) approach, the canonical ordination technique (redundancy analysis; RDA) consistently outperformed the other statistical tests (i.e., TLAs, Mantel test, and RDA based on linear time trends) using all real data. In addition, the RDA-PCNM revealed different patterns of temporal change, and the strength of each individual time pattern, in terms of adjusted variance explained, could be evaluated, It also identified species contributions to these patterns of temporal change. This additional information is not provided by distance-based methods. The simulation study revealed better Type I error properties of the canonical ordination techniques compared with the distance-based approaches when no deterministic component of change was imposed on the communities. The simulation also revealed that strong emphasis on uniform deterministic change and low variability at other temporal scales is needed to result in decreased statistical power of the RDA-PCNM approach relative to the other methods. Based on the statistical performance of and information content provided by RDA-PCNM models, this technique serves ecologists as a powerful tool for modeling temporal change of ecological (pseudo-) communities.

  3. Nuclear analysis techniques as a component of thermoluminescence dating

    Energy Technology Data Exchange (ETDEWEB)

    Prescott, J.R.; Hutton, J.T.; Habermehl, M.A. [Adelaide Univ., SA (Australia); Van Moort, J. [Tasmania Univ., Sandy Bay, TAS (Australia)

    1996-12-31

    In luminescence dating, an age is found by first measuring dose accumulated since the event being dated, then dividing by the annual dose rate. Analyses of minor and trace elements performed by nuclear techniques have long formed an essential component of dating. Results from some Australian sites are reported to illustrate the application of nuclear techniques of analysis in this context. In particular, a variety of methods for finding dose rates are compared, an example of a site where radioactive disequilibrium is significant and a brief summary is given of a problem which was not resolved by nuclear techniques. 5 refs., 2 tabs.
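
    The age equation in the first sentence is simple enough to state directly; the numbers below are assumed for illustration and are not taken from the Australian sites in the paper:

```python
# Luminescence age = accumulated dose / annual dose rate.
# Illustrative (assumed) values, not from the paper:
accumulated_dose_gy = 24.0   # equivalent dose measured by luminescence (Gy)
annual_dose_gy = 0.003       # dose rate from elemental analyses (Gy/year)
age_years = accumulated_dose_gy / annual_dose_gy
```

    This is why the elemental analyses matter: the dose-rate denominator comes from the concentrations of radioactive minor and trace elements, so any error there propagates directly into the age.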

  4. Nuclear analysis techniques as a component of thermoluminescence dating

    Energy Technology Data Exchange (ETDEWEB)

    Prescott, J R; Hutton, J T; Habermehl, M A [Adelaide Univ., SA (Australia); Van Moort, J [Tasmania Univ., Sandy Bay, TAS (Australia)

    1997-12-31

    In luminescence dating, an age is found by first measuring dose accumulated since the event being dated, then dividing by the annual dose rate. Analyses of minor and trace elements performed by nuclear techniques have long formed an essential component of dating. Results from some Australian sites are reported to illustrate the application of nuclear techniques of analysis in this context. In particular, a variety of methods for finding dose rates are compared, an example of a site where radioactive disequilibrium is significant and a brief summary is given of a problem which was not resolved by nuclear techniques. 5 refs., 2 tabs.

  5. Application of pattern recognition techniques to crime analysis

    Energy Technology Data Exchange (ETDEWEB)

    Bender, C.F.; Cox, L.A. Jr.; Chappell, G.A.

    1976-08-15

    The initial goal was to evaluate the capabilities of current pattern recognition techniques when applied to existing computerized crime data. Performance was to be evaluated both in terms of the system's capability to predict crimes and to optimize police manpower allocation. A relation was sought to predict the crime's susceptibility to solution, based on knowledge of the crime type, location, time, etc. The preliminary results of this work are discussed. They indicate that automatic crime analysis involving pattern recognition techniques is feasible, and that efforts to determine optimum variables and techniques are warranted. 47 figures (RWR)

  6. Determinants of Egyptian Banking Sector Profitability: Time-Series Analysis from 2004-2014

    Directory of Open Access Journals (Sweden)

    Heba Youssef Hashem

    2016-06-01

    Full Text Available Purpose - The purpose of this paper is to examine the determinants of banking sector profitability in Egypt to shed light on the most influential variables that have a significant impact on the performance of this vital sector. Design/methodology/approach - The analysis includes a time series model of quarterly data from 2004 to 2014. The model utilizes the cointegration technique to investigate the long-run relationship between the return on equity, as a proxy for bank profitability, and several bank-specific variables including liquidity, capital adequacy, and the percentage of non-performing loans. In addition, a Vector Error Correction Model (VECM) is utilized to explore the short-run dynamics of the model and the speed of adjustment to reach the long-run equilibrium. Findings - The main findings of this work show that banking sector profitability is inversely related to capital adequacy, the percentage of loan provisions and the ratio of deposits to total assets. On the other hand, it is positively related to the size of the banking sector, which implies that the banking sector exhibits economies of scale. Research limitations/implications - The implication of this work is that it helps reveal the major factors affecting bank performance in the short run and the long run, and hence provides bank managers and monetary policy makers with beneficial insights on how to enhance bank performance. Since the banking sector represents one of the main engines of financing investment, enhancing the efficiency of this sector would contribute to economic growth and prosperity. Originality/value - The Vector Error Correction Model showed that about 4% of the disequilibrium is corrected each quarter to reach the long-run equilibrium. In addition, all bank-specific variables were found to affect profitability in the long run only. This study would serve as a base that further work on Egyptian banking sector profitability can build on by incorporating more variables in the
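
    The error-correction idea behind the "speed of adjustment" finding can be sketched with a two-step Engle-Granger-style regression on synthetic data (the paper uses a full VECM on real quarterly data; all variables and coefficients here are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(5)
n = 200
x = rng.normal(size=n).cumsum()             # nonstationary driver (e.g. sector size)
y = 2.0 + 0.5 * x + rng.normal(0, 0.3, n)   # cointegrated "profitability" proxy

# Step 1: long-run relationship y = a + b*x; the residual measures the
# current deviation from the long-run equilibrium
A = np.column_stack([np.ones(n), x])
a, b = np.linalg.lstsq(A, y, rcond=None)[0]
ect = y - (a + b * x)                       # error-correction term

# Step 2: regress the change in y on the lagged deviation; the (negative)
# coefficient is the speed of adjustment back toward equilibrium
dy = np.diff(y)
Z = np.column_stack([np.ones(n - 1), ect[:-1]])
const, gamma = np.linalg.lstsq(Z, dy, rcond=None)[0]
fraction_corrected_per_period = -gamma
```

    A speed-of-adjustment coefficient of, say, 0.04 per quarter corresponds to the paper's finding that about 4% of the disequilibrium is corrected each quarter; in this synthetic example the adjustment is nearly complete each period because y fluctuates tightly around its equilibrium.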

  7. Time-Series Analysis of Remotely-Sensed SeaWiFS Chlorophyll in River-Influenced Coastal Regions

    Science.gov (United States)

    Acker, James G.; McMahon, Erin; Shen, Suhung; Hearty, Thomas; Casey, Nancy

    2009-01-01

    The availability of a nearly-continuous record of remotely-sensed chlorophyll a data (chl a) from the Sea-viewing Wide Field-of-view Sensor (SeaWiFS) mission, now longer than ten years, enables examination of time-series trends for multiple global locations. Innovative data analysis technology available on the World Wide Web facilitates such analyses. In coastal regions influenced by river outflows, chl a is not always indicative of actual trends in phytoplankton chlorophyll due to the interference of colored dissolved organic matter and suspended sediments; significant chl a time-series trends for coastal regions influenced by river outflows may nonetheless be indicative of important alterations of the hydrologic and coastal environment. Chl a time-series analysis of nine marine regions influenced by river outflows demonstrates the simplicity and usefulness of this technique. The analyses indicate that coastal time series are significantly influenced by unusual flood events. Major river systems in regions with relatively low human impact did not exhibit significant trends. Most river systems with demonstrated human impact exhibited significant negative trends, with the noteworthy exception of the Pearl River in China, which has a positive trend.
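
    The abstract does not name its trend test; as one common choice for detecting monotonic trends in such environmental time series, a Mann-Kendall sketch on synthetic monthly chl a values (all data assumed for illustration):

```python
import numpy as np

def mann_kendall(y):
    """Mann-Kendall trend test: S statistic and approximate z-score.

    Positive z indicates an increasing monotonic trend; |z| > 1.96 is
    significant at the 5% level (normal approximation, no tie correction
    in this sketch).
    """
    n = len(y)
    s = sum(np.sign(y[j] - y[i]) for i in range(n) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        z = (s - 1) / np.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / np.sqrt(var_s)
    else:
        z = 0.0
    return s, z

rng = np.random.default_rng(12)
months = np.arange(120)                            # ~10 years, monthly (synthetic)
declining = 1.0 - 0.002 * months + 0.05 * rng.normal(size=120)
s, z = mann_kendall(declining)
```

    A slow decline buried in noise still produces a strongly significant negative z, which is the kind of "significant negative trend" reported for most human-impacted river systems.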

  8. Assessing error sources for Landsat time series analysis for tropical test sites in Viet Nam and Ethiopia

    Science.gov (United States)

    Schultz, Michael; Verbesselt, Jan; Herold, Martin; Avitabile, Valerio

    2013-10-01

    Researchers who use remotely sensed data can spend half of their total effort on pre-processing data prior to analysis. If this pre-processing does not match the application, the time spent on data analysis can increase considerably and can lead to inaccuracies. Despite the existence of a number of methods for pre-processing Landsat time series, each method has shortcomings, particularly for mapping forest changes under varying illumination, data availability and atmospheric conditions. Based on the requirements for mapping forest changes defined by the United Nations (UN) Reducing Emissions from Deforestation and Forest Degradation (REDD) program, accurate reporting of the spatio-temporal properties of these changes is necessary. We compared the impact of three fundamentally different radiometric pre-processing techniques, Moderate Resolution Atmospheric TRANsmission (MODTRAN), Second Simulation of a Satellite Signal in the Solar Spectrum (6S) and simple Dark Object Subtraction (DOS), on mapping forest changes using Landsat time series data. A modification of Breaks For Additive Season and Trend (BFAST) monitor was used to jointly map the spatial and temporal agreement of forest changes at test sites in Ethiopia and Viet Nam. The suitability of the pre-processing methods for the observed forest change drivers was assessed using recently captured ground truth and high-resolution data (1000 points). A method for creating robust generic forest maps used for the sampling design is presented. An assessment of error sources identified haze as a major source of commission error in the time series analysis.
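
    Of the three corrections compared, DOS is simple enough to sketch: it assumes the darkest pixels in a band should have near-zero reflectance, so any signal there is additive haze. The percentile choice and synthetic band below are our assumptions:

```python
import numpy as np

def dark_object_subtraction(band, percentile=0.01):
    """Simple DOS atmospheric correction for one image band.

    The value of the darkest pixels is taken as the additive haze offset
    and subtracted from the whole band (clipped at zero)."""
    dark_value = np.percentile(band, percentile)
    return np.clip(band - dark_value, 0, None)

rng = np.random.default_rng(6)
true_band = rng.uniform(0, 0.4, size=(100, 100))   # synthetic reflectances
true_band[:5, :5] = 0.0                            # genuinely dark pixels (shadow/water)
hazy = true_band + 0.08                            # uniform additive haze
corrected = dark_object_subtraction(hazy)
```

    With a purely additive, spatially uniform haze the correction is exact; real scenes violate both assumptions, which is why DOS is the crudest of the three methods compared.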

  9. Applications of Electromigration Techniques: Applications of Electromigration Techniques in Food Analysis

    Science.gov (United States)

    Wieczorek, Piotr; Ligor, Magdalena; Buszewski, Bogusław

    Electromigration techniques, including capillary electrophoresis (CE), are widely used for the separation and identification of compounds present in food products. These techniques may also be considered alternate and complementary to commonly used analytical techniques such as high-performance liquid chromatography (HPLC) or gas chromatography (GC). Applications of CE to the determination of high-molecular compounds, like polyphenols (including flavonoids), pigments, vitamins, and food additives (preservatives, antioxidants, sweeteners, artificial pigments) are presented. Methods developed for the determination of proteins and peptides composed of amino acids, which are basic components of food products, are also studied. Other substances such as carbohydrates, nucleic acids, biogenic amines, natural toxins, and other contaminants, including pesticides and antibiotics, are discussed. The possibility of applying CE in food control laboratories, where analyses of the composition of food and food products are conducted, is of great importance. The CE technique may be used for the control of technological processes in the food industry and for the identification of numerous compounds present in food. Due to its numerous advantages, the CE technique is successfully used in routine food analysis.

  10. Improved Sectional Image Analysis Technique for Evaluating Fiber Orientations in Fiber-Reinforced Cement-Based Materials.

    Science.gov (United States)

    Lee, Bang Yeon; Kang, Su-Tae; Yun, Hae-Bum; Kim, Yun Yong

    2016-01-12

    The distribution of fiber orientation is an important factor in determining the mechanical properties of fiber-reinforced concrete. This study proposes a new image analysis technique for improving the evaluation accuracy of fiber orientation distribution in the sectional image of fiber-reinforced concrete. A series of tests on the accuracy of fiber detection and the estimation performance of fiber orientation was performed on artificial fiber images to assess the validity of the proposed technique. The validation test results showed that the proposed technique estimates the distribution of fiber orientation more accurately than the direct measurement of fiber orientation by image analysis.
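
    The geometric relationship such sectional analyses rest on is worth stating: a cylindrical fiber cut by a plane leaves an elliptical cross-section whose axis ratio encodes the inclination. The function below is our illustrative formulation, not the paper's algorithm:

```python
import numpy as np

def fiber_inclination(minor_axis, major_axis):
    """Inclination of a cylindrical fiber from its sectional ellipse.

    A fiber of diameter d cut at angle theta to the section normal leaves
    an ellipse with minor axis d and major axis d / cos(theta), so
    theta = arccos(minor / major); a circle (ratio 1) gives theta = 0.
    """
    ratio = np.clip(minor_axis / major_axis, 0.0, 1.0)
    return np.degrees(np.arccos(ratio))

# A fiber whose ellipse major axis is twice the fiber diameter is
# inclined at 60 degrees to the section normal
theta = fiber_inclination(0.2, 0.4)
```

    The hard part, which the proposed image analysis technique addresses, is measuring those axes accurately when fibers touch, overlap, or are cut near their ends.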

  11. Time Series Data Analysis of Wireless Sensor Network Measurements of Temperature.

    Science.gov (United States)

    Bhandari, Siddhartha; Bergmann, Neil; Jurdak, Raja; Kusy, Branislav

    2017-05-26

    Wireless sensor networks have gained significant traction in environmental signal monitoring and analysis. The cost or lifetime of the system typically depends on the frequency at which environmental phenomena are monitored. If sampling rates are reduced, energy is saved. Using empirical datasets collected from environmental monitoring sensor networks, this work performs time series analyses of measured temperature time series. Unlike previous works which have concentrated on suppressing the transmission of some data samples by time-series analysis but still maintaining high sampling rates, this work investigates reducing the sampling rate (and sensor wake up rate) and looks at the effects on accuracy. Results show that the sampling period of the sensor can be increased up to one hour while still allowing intermediate and future states to be estimated with interpolation RMSE less than 0.2 °C and forecasting RMSE less than 1 °C.
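
    The subsample-then-interpolate experiment is easy to reproduce on a synthetic diurnal temperature signal; the signal model and noise level below are assumptions, not the paper's field data:

```python
import numpy as np

rng = np.random.default_rng(7)
t = np.arange(0, 7 * 24 * 60, dtype=float)    # one week at 1-minute sampling
# Smooth diurnal cycle (20 +/- 5 degrees C) plus small sensor noise
temp = 20 + 5 * np.sin(2 * np.pi * t / (24 * 60)) + 0.05 * rng.normal(size=t.size)

period = 60                                    # keep one sample per hour
t_sub, temp_sub = t[::period], temp[::period]

# Estimate the skipped minutes by linear interpolation between hourly samples
estimate = np.interp(t, t_sub, temp_sub)
rmse = np.sqrt(np.mean((estimate - temp) ** 2))
```

    For a signal this smooth, hourly sampling with linear interpolation stays well under the 0.2 degree C error the abstract reports, illustrating why the sampling (and wake-up) rate can be cut so aggressively.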

  12. Multi-complexity ensemble measures for gait time series analysis: application to diagnostics, monitoring and biometrics.

    Science.gov (United States)

    Gavrishchaka, Valeriy; Senyukova, Olga; Davis, Kristina

    2015-01-01

    Previously, we have proposed to use complementary complexity measures discovered by boosting-like ensemble learning for the enhancement of quantitative indicators dealing with necessarily short physiological time series. We have confirmed robustness of such multi-complexity measures for heart rate variability analysis with the emphasis on detection of emerging and intermittent cardiac abnormalities. Recently, we presented preliminary results suggesting that such ensemble-based approach could be also effective in discovering universal meta-indicators for early detection and convenient monitoring of neurological abnormalities using gait time series. Here, we argue and demonstrate that these multi-complexity ensemble measures for gait time series analysis could have significantly wider application scope ranging from diagnostics and early detection of physiological regime change to gait-based biometrics applications.

  13. Automated thermal mapping techniques using chromatic image analysis

    Science.gov (United States)

    Buck, Gregory M.

    1989-01-01

    Thermal imaging techniques are introduced using a chromatic image analysis system and temperature sensitive coatings. These techniques are used for thermal mapping and surface heat transfer measurements on aerothermodynamic test models in hypersonic wind tunnels. Measurements are made on complex vehicle configurations in a timely manner and at minimal expense. The image analysis system uses separate wavelength filtered images to analyze surface spectral intensity data. The system was initially developed for quantitative surface temperature mapping using two-color thermographic phosphors but was found useful in interpreting phase change paint and liquid crystal data as well.

  14. Using Machine Learning Techniques in the Analysis of Oceanographic Data

    Science.gov (United States)

    Falcinelli, K. E.; Abuomar, S.

    2017-12-01

    Acoustic Doppler Current Profilers (ADCPs) are oceanographic tools capable of collecting large amounts of current profile data. Using unsupervised machine learning techniques such as principal component analysis, fuzzy c-means clustering, and self-organizing maps, patterns and trends in an ADCP dataset are found. Cluster validity algorithms such as visual assessment of cluster tendency and clustering index are used to determine the optimal number of clusters in the ADCP dataset. These techniques prove to be useful in analysis of ADCP data and demonstrate potential for future use in other oceanographic applications.
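
    As a compact stand-in for the PCA step (the clustering and SOM stages, and the real ADCP data, are beyond this sketch), here is PCA via SVD on synthetic current profiles driven by two assumed latent modes:

```python
import numpy as np

rng = np.random.default_rng(8)
# Synthetic stand-in for ADCP data: 500 profiles x 20 depth bins, driven by
# two latent modes (a sheared mode and a depth-uniform mode) plus noise
depth_shape1 = np.linspace(1.0, -1.0, 20)     # sheared mode
depth_shape2 = np.ones(20)                    # barotropic (uniform) mode
amps = rng.normal(size=(500, 2))
profiles = amps[:, :1] * depth_shape1 + amps[:, 1:] * depth_shape2
profiles += 0.05 * rng.normal(size=profiles.shape)

# Principal component analysis via SVD of the centered data matrix
X = profiles - profiles.mean(axis=0)
U, s, Vt = np.linalg.svd(X, full_matrices=False)
explained = s**2 / (s**2).sum()               # variance fraction per component
```

    Two components capture nearly all of the variance, mirroring how PCA compresses a large current-profile dataset into a few physically interpretable modes before clustering.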

  15. Harmonic Analysis of a Nonstationary Series of Temperature Paleoreconstruction for the Central Part of Greenland

    Directory of Open Access Journals (Sweden)

    T.E. Danova

    2016-06-01

    Full Text Available The results of the investigation of a transformed series of reconstructed air temperature data for the central part of Greenland with an increment of 30 years are presented. Stationarization of a ~ 50,000-year series of reconstructed air temperature in the central part of Greenland, according to ice core data, was performed using the mathematical expectation. To estimate the mathematical expectation, smoothing was carried out by the moving average method and by wavelet analysis. The Fourier transform was applied repeatedly to the stationarized series, with the averaging time varied in the smoothing process. Three averaging times were selected for the investigation: ~ 400-500 years, ~ 2,000 years, and ~ 4,000 years. Stationarization of the reconstructed temperature series with the help of the wavelet transform showed the best results when applying averaging times of ~ 400 and ~ 2,000 years; the trends then characterize the initial temperature series well, thereby revealing the main patterns of its dynamics. Using the averaging time of ~ 4,000 years showed the worst result: significant events of the main temperature series were lost in the process of averaging. The obtained results correspond well to the cyclicity known to be inherent in the climatic system of the planet; the detected modes of 1,470 ± 500 years are comparable to the Dansgaard-Oeschger and Bond oscillations.
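
    The smooth-subtract-transform pipeline can be sketched on a synthetic proxy series; the trend, the injected ~1470-year cycle, and the 15-sample (~450-year) window are assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(9)
t = np.arange(0, 48000.0, 30.0)               # ~48 kyr at 30-year increments
# Synthetic "temperature": slow trend + a ~1470-year oscillation + noise
series = 0.0001 * t + np.sin(2 * np.pi * t / 1470.0) + 0.3 * rng.normal(size=t.size)

# Stationarize: estimate the expectation by a moving average, then subtract
win = 15                                       # ~450-year smoothing window
trend = np.convolve(series, np.ones(win) / win, mode="same")
stationary = (series - trend)[win:-win]        # trim edge artifacts

# Fourier transform of the stationarized series reveals the dominant cycle
spec = np.abs(np.fft.rfft(stationary - stationary.mean()))
freqs = np.fft.rfftfreq(stationary.size, d=30.0)
dominant_period = 1.0 / freqs[np.argmax(spec[1:]) + 1]
```

    Even though the moving average partially attenuates the cycle itself, the residual spectrum still peaks near 1470 years, which is how a Dansgaard-Oeschger-scale mode would stand out after stationarization.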

  16. Properties of Asymmetric Detrended Fluctuation Analysis in the time series of RR intervals

    Science.gov (United States)

    Piskorski, J.; Kosmider, M.; Mieszkowski, D.; Krauze, T.; Wykretowicz, A.; Guzik, P.

    2018-02-01

    Heart rate asymmetry is a phenomenon by which the accelerations and decelerations of heart rate behave differently, and this difference is consistent and unidirectional, i.e. in most of the analyzed recordings the inequalities have the same direction. So far, it has been established for variance- and runs-based descriptors of RR interval time series. In this paper we apply the newly developed method of Asymmetric Detrended Fluctuation Analysis, which so far has mainly been used with economic time series, to a set of 420 stationary 30-min time series of RR intervals from young, healthy individuals aged between 20 and 40. This asymmetric approach introduces separate scaling exponents for rising and falling trends. We systematically study the presence of asymmetry in both the global and local versions of this method. In this study, global means "applying to the whole time series" and local means "applying to windows jumping along the recording". It is found that the correlation structure of the fluctuations left over after detrending in physiological time series shows strong asymmetric features; this asymmetry is absent in physiological data after shuffling and in a group of symmetric synthetic time series.
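
    The core idea, separate scaling exponents for rising and falling trends, can be sketched as follows. This is a simplified illustration, not the authors' exact algorithm: here a window's direction is taken from the sign of the local trend fitted to the integrated profile, and the test data are symmetric white noise, for which α+ and α− should roughly coincide near 0.5.

```python
import numpy as np

def adfa(x, scales):
    """Minimal Asymmetric DFA sketch: separate fluctuation functions for
    boxes with rising vs. falling local trends (illustration only)."""
    profile = np.cumsum(x - np.mean(x))
    fp, fm = [], []
    for s in scales:
        n_boxes = len(profile) // s
        res_plus, res_minus = [], []
        t = np.arange(s)
        for i in range(n_boxes):
            seg = profile[i * s:(i + 1) * s]
            slope, intercept = np.polyfit(t, seg, 1)
            resid = seg - (slope * t + intercept)
            # The direction of the local trend decides which exponent
            # this box feeds.
            (res_plus if slope >= 0 else res_minus).append(np.mean(resid**2))
        fp.append(np.sqrt(np.mean(res_plus)))
        fm.append(np.sqrt(np.mean(res_minus)))
    log_s = np.log(scales)
    a_plus = np.polyfit(log_s, np.log(fp), 1)[0]
    a_minus = np.polyfit(log_s, np.log(fm), 1)[0]
    return a_plus, a_minus

rng = np.random.default_rng(2)
x = rng.normal(size=20000)          # symmetric white noise, alpha ~ 0.5
scales = np.array([16, 32, 64, 128, 256])
a_plus, a_minus = adfa(x, scales)
print(f"alpha+ = {a_plus:.2f}, alpha- = {a_minus:.2f}")
```

For asymmetric series such as RR intervals, the two exponents would separate; for shuffled or symmetric synthetic data they should not.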

  17. The application of complex network time series analysis in turbulent heated jets

    International Nuclear Information System (INIS)

    Charakopoulos, A. K.; Karakasidis, T. E.; Liakopoulos, A.; Papanicolaou, P. N.

    2014-01-01

    In the present study, we applied the methodology of complex network-based time series analysis to experimental temperature time series from a vertical turbulent heated jet. More specifically, we approach the hydrodynamic problem of discriminating time series corresponding to various regions relative to the jet axis, i.e., distinguishing time series from regions close to the jet axis from time series originating in regions with a different dynamical regime, based on the constructed network properties. Applying the phase space transformation method (k nearest neighbors) and also the visibility algorithm, we transformed the time series into networks and evaluated topological properties of the networks such as the degree distribution, average path length, diameter, modularity, and clustering coefficient. The results show that the complex network approach allows distinguishing, identifying, and exploring in detail various dynamical regions of the jet flow, and associating them with the corresponding physical behavior. In addition, in order to reject the hypothesis that the studied networks originate from a stochastic process, we generated random networks and compared their statistical properties with those originating from the experimental data. As far as the efficiency of the two methods for network construction is concerned, we conclude that both methodologies lead to network properties that present almost the same qualitative behavior and allow us to reveal the underlying system dynamics.
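
    The visibility algorithm mentioned above maps a time series onto a network in a few lines. A minimal O(n²) sketch of the natural visibility graph, whose node degrees can then feed the topological measures listed in the abstract:

```python
import numpy as np

def visibility_graph(x):
    """Natural visibility algorithm sketch: nodes are samples, and two
    samples (i, x_i), (j, x_j) are linked if every point between them
    lies strictly below the straight line connecting them."""
    n = len(x)
    adj = {i: set() for i in range(n)}
    for i in range(n):
        for j in range(i + 1, n):
            visible = all(
                x[k] < x[i] + (x[j] - x[i]) * (k - i) / (j - i)
                for k in range(i + 1, j)
            )
            if visible:
                adj[i].add(j)
                adj[j].add(i)
    return adj

rng = np.random.default_rng(3)
series = rng.normal(size=200)
adj = visibility_graph(series)
degrees = np.array([len(adj[i]) for i in adj])
print(f"mean degree = {degrees.mean():.2f}, max degree = {degrees.max()}")
```

Consecutive samples are always mutually visible, so the graph is connected; differences in degree distribution and clustering between flow regions are what the network comparison exploits.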

  18. Statistical attribution analysis of the nonstationarity of the annual runoff series of the Weihe River.

    Science.gov (United States)

    Xiong, Lihua; Jiang, Cong; Du, Tao

    2014-01-01

    Time-varying moments models based on Pearson Type III and normal distributions respectively are built under the generalized additive model in location, scale and shape (GAMLSS) framework to analyze the nonstationarity of the annual runoff series of the Weihe River, the largest tributary of the Yellow River. The detection of nonstationarities in hydrological time series (annual runoff, precipitation and temperature) from 1960 to 2009 is carried out using a GAMLSS model, and then the covariate analysis for the annual runoff series is implemented with GAMLSS. Finally, the attribution of each covariate to the nonstationarity of annual runoff is analyzed quantitatively. The results demonstrate that (1) obvious change-points exist in all three hydrological series, (2) precipitation, temperature and irrigated area are all significant covariates of the annual runoff series, and (3) temperature increase plays the main role in leading to the reduction of the annual runoff series in the study basin, followed by the decrease of precipitation and the increase of irrigated area.
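
    GAMLSS fits richer distributions (e.g. Pearson Type III) with several time-varying parameters; the sketch below only contrasts a constant-mean Gaussian with a mean linear in one covariate, on fabricated data, to show the model-comparison logic behind a covariate analysis of nonstationarity.

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical stand-in for an annual runoff series whose mean shifts
# with a covariate (e.g. temperature); not the Weihe River data.
n = 50
temperature = np.linspace(0.0, 1.0, n) + 0.1 * rng.normal(size=n)
runoff = 100.0 - 30.0 * temperature + 5.0 * rng.normal(size=n)

def gaussian_aic(y, mu):
    """-2 log-likelihood of a Gaussian with given mean vector; the
    caller adds the 2k parameter penalty."""
    sigma = np.std(y - mu)
    loglik = np.sum(-0.5 * np.log(2 * np.pi * sigma**2)
                    - (y - mu)**2 / (2 * sigma**2))
    return -2 * loglik

# Stationary model: constant mean (2 parameters: mu, sigma).
aic_const = gaussian_aic(runoff, np.full(n, runoff.mean())) + 2 * 2

# Nonstationary model: mean linear in the covariate (3 parameters).
b, a = np.polyfit(temperature, runoff, 1)
aic_covar = gaussian_aic(runoff, a + b * temperature) + 2 * 3

print(f"AIC constant mean: {aic_const:.1f}, AIC with covariate: {aic_covar:.1f}")
```

A lower AIC for the covariate model is the kind of evidence used to attribute the runoff reduction to a significant covariate such as temperature.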

  19. Windows forensic analysis toolkit advanced analysis techniques for Windows 7

    CERN Document Server

    Carvey, Harlan

    2012-01-01

    Now in its third edition, Harlan Carvey has updated "Windows Forensic Analysis Toolkit" to cover Windows 7 systems. The primary focus of this edition is on analyzing Windows 7 systems and on processes using free and open-source tools. The book covers live response, file analysis, malware detection, timeline, and much more. The author presents real-life experiences from the trenches, making the material realistic and showing the why behind the how. New to this edition, the companion and toolkit materials are now hosted online. This material consists of electronic printable checklists, cheat sheets, free custom tools, and walk-through demos. This edition complements "Windows Forensic Analysis Toolkit, 2nd Edition", (ISBN: 9781597494229), which focuses primarily on XP. It includes complete coverage and examples on Windows 7 systems. It contains Lessons from the Field, Case Studies, and War Stories. It features companion online material, including electronic printable checklists, cheat sheets, free custom tools, ...

  20. A Filtering of Incomplete GNSS Position Time Series with Probabilistic Principal Component Analysis

    Science.gov (United States)

    Gruszczynski, Maciej; Klos, Anna; Bogusz, Janusz

    2018-04-01

    For the first time, we introduce probabilistic principal component analysis (pPCA) for the spatio-temporal filtering of Global Navigation Satellite System (GNSS) position time series, estimating and removing the Common Mode Error (CME) without interpolation of missing values. We used data from International GNSS Service (IGS) stations that contributed to the latest International Terrestrial Reference Frame (ITRF2014). The efficiency of the proposed algorithm was tested on simulated incomplete time series, and CME was then estimated for a set of 25 stations located in Central Europe. The newly applied pPCA was compared with previously used algorithms, which showed that this method is capable of proper spatio-temporal filtering of GNSS time series with different observation time spans. We showed that filtering can be carried out with the pPCA method even when two time series in the dataset share fewer than 100 common epochs of observations. The 1st Principal Component (PC) explained more than 36% of the total variance of the time series residuals (series with the deterministic model removed), which, compared with the variances of the other PCs (less than 8%), means that common signals are significant in GNSS residuals. A clear improvement in the spectral indices of the power-law noise was noticed for the Up component, reflected by an average shift towards white noise from -0.98 to -0.67 (30%). We observed a significant average reduction in the uncertainty of the stations' velocities estimated from the filtered residuals: 35, 28 and 69% for the North, East, and Up components, respectively. The CME series were also analyzed in the context of environmental mass loading influences on the filtering results. Subtracting the environmental loading models from the GNSS residuals reduces the estimated CME variance by 20 and 65% for the horizontal and vertical components, respectively.
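
    The classical complete-data step that pPCA generalizes to gappy series can be sketched directly: stack the station residuals, take the leading spatial eigenvector, and read off the common mode as its score. The residuals below are simulated (a shared random-walk CME plus station noise), and the sketch ignores missing data, which is exactly the limitation pPCA removes.

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical residual series for 10 stations over 500 epochs:
# a shared Common Mode Error plus station-specific noise.
n_epochs, n_stations = 500, 10
cme_true = np.cumsum(rng.normal(scale=0.3, size=n_epochs))  # correlated CME
residuals = cme_true[:, None] + rng.normal(scale=1.0, size=(n_epochs, n_stations))

# Complete-data PCA step: the leading eigenvector of the station
# covariance captures the signal common to all stations.
X = residuals - residuals.mean(axis=0)
cov = X.T @ X / n_epochs
eigvals, eigvecs = np.linalg.eigh(cov)
v1 = eigvecs[:, -1]                        # leading spatial eigenvector
cme_est = X @ v1 * np.sign(v1.sum())       # fix the sign ambiguity

corr = np.corrcoef(cme_est, cme_true - cme_true.mean())[0, 1]
frac = eigvals[-1] / eigvals.sum()
print(f"correlation with true CME: {corr:.3f}, PC1 variance fraction: {frac:.2f}")
```

Removing `cme_est * v1[i]` from each station i is the filtering step; pPCA performs the same factorization probabilistically, so epochs with gaps still contribute.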

  1. Conference on Techniques of Nuclear and Conventional Analysis and Applications

    International Nuclear Information System (INIS)

    2012-01-01

    Full text: With their wide scope, particularly in the areas of environment, geology, mining, industry and the life sciences, analysis techniques are of great importance in both fundamental and applied research. The Conference on Techniques for Nuclear and Conventional Analysis and Applications (TANCA) is registered in the national strategy of opening the University and national research centers to their local, national and international environments. This conference aims to: promote nuclear and conventional analytical techniques; contribute to creating synergy between the different players involved in these techniques, including universities, research organizations, regulatory authorities, economic operators, NGOs and others; inform and educate potential users about the performance of these techniques; strengthen exchanges and links between researchers, industry and policy makers; implement a program of inter-laboratory comparison between Moroccan laboratories on the one hand and their foreign counterparts on the other; and contribute to the research training of doctoral students and postdoctoral scholars. Given the relevance and importance of issues related to the environment and its impact on cultural heritage, this fourth edition of TANCA is devoted to the application of conventional and nuclear analytical techniques to questions tied to the environment and its impact on cultural heritage.

  2. A Recurrent Probabilistic Neural Network with Dimensionality Reduction Based on Time-series Discriminant Component Analysis.

    Science.gov (United States)

    Hayashi, Hideaki; Shibanoki, Taro; Shima, Keisuke; Kurita, Yuichi; Tsuji, Toshio

    2015-12-01

    This paper proposes a probabilistic neural network (NN) developed on the basis of time-series discriminant component analysis (TSDCA) that can be used to classify high-dimensional time-series patterns. TSDCA involves the compression of high-dimensional time series into a lower dimensional space using a set of orthogonal transformations and the calculation of posterior probabilities based on a continuous-density hidden Markov model with a Gaussian mixture model expressed in the reduced-dimensional space. The analysis can be incorporated into an NN, which is named a time-series discriminant component network (TSDCN), so that parameters of dimensionality reduction and classification can be obtained simultaneously as network coefficients according to a backpropagation through time-based learning algorithm with the Lagrange multiplier method. The TSDCN is considered to enable high-accuracy classification of high-dimensional time-series patterns and to reduce the computation time taken for network training. The validity of the TSDCN is demonstrated for high-dimensional artificial data and electroencephalogram signals in the experiments conducted during the study.

  3. Bayesian near-boundary analysis in basic macroeconomic time series models

    NARCIS (Netherlands)

    M.D. de Pooter (Michiel); F. Ravazzolo (Francesco); R. Segers (René); H.K. van Dijk (Herman)

    2008-01-01

    Several lessons learnt from a Bayesian analysis of basic macroeconomic time series models are presented for the situation where some model parameters have substantial posterior probability near the boundary of the parameter region. This feature refers to near-instability within dynamic…

  4. A Comparison of Missing-Data Procedures for Arima Time-Series Analysis

    Science.gov (United States)

    Velicer, Wayne F.; Colby, Suzanne M.

    2005-01-01

    Missing data are a common practical problem for longitudinal designs. Time-series analysis is a longitudinal method that involves a large number of observations on a single unit. Four different missing-data methods (deletion, mean substitution, mean of adjacent observations, and maximum likelihood estimation) were evaluated. Computer-generated…
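
    Two of the four methods named above, mean substitution and the mean of adjacent observations, can be compared in a few lines on a simulated AR(1) series (toy data, not the study's simulation design):

```python
import numpy as np

rng = np.random.default_rng(11)

# AR(1) series with roughly 10% of values treated as missing.
n = 1000
x = np.empty(n)
x[0] = 0.0
for t in range(1, n):
    x[t] = 0.8 * x[t - 1] + rng.normal()
mask = rng.random(n) < 0.1
mask[0] = mask[-1] = False      # keep the endpoints observed
y = x.copy()                    # y[mask] is considered unobserved

# Method 1: mean substitution (series mean of the observed values).
mean_sub = y.copy()
mean_sub[mask] = y[~mask].mean()

# Method 2: mean of adjacent observations, implemented here as linear
# interpolation between the nearest non-missing neighbors.
idx = np.arange(n)
adj_mean = y.copy()
adj_mean[mask] = np.interp(idx[mask], idx[~mask], y[~mask])

rmse_mean = np.sqrt(np.mean((mean_sub[mask] - x[mask])**2))
rmse_adj = np.sqrt(np.mean((adj_mean[mask] - x[mask])**2))
print(f"RMSE mean substitution: {rmse_mean:.2f}, adjacent mean: {rmse_adj:.2f}")
```

For positively autocorrelated data the adjacent-observation method should reconstruct the missing values with a clearly smaller error, which is the kind of contrast the study evaluates before fitting ARIMA models.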

  5. Using trajectory sensitivity analysis to find suitable locations of series compensators for improving rotor angle stability

    DEFF Research Database (Denmark)

    Nasri, Amin; Eriksson, Robert; Ghandhar, Mehrdad

    2014-01-01

    This paper proposes an approach based on trajectory sensitivity analysis (TSA) to find the most suitable placement of series compensators in the power system. The main objective is to maximize the benefit of these devices in order to enhance rotor angle stability. This approach is formulated...

  6. Detrended partial cross-correlation analysis of two nonstationary time series influenced by common external forces

    Science.gov (United States)

    Qian, Xi-Yuan; Liu, Ya-Min; Jiang, Zhi-Qiang; Podobnik, Boris; Zhou, Wei-Xing; Stanley, H. Eugene

    2015-06-01

    When common factors strongly influence two power-law cross-correlated time series recorded in complex natural or social systems, using detrended cross-correlation analysis (DCCA) without considering these common factors will bias the results. We use detrended partial cross-correlation analysis (DPXA) to uncover the intrinsic power-law cross correlations between two simultaneously recorded time series in the presence of nonstationarity after removing the effects of other time series acting as common forces. The DPXA method is a generalization of the detrended cross-correlation analysis that takes into account partial correlation analysis. We demonstrate the method by using bivariate fractional Brownian motions contaminated with a fractional Brownian motion. We find that the DPXA is able to recover the analytical cross Hurst indices, and thus the multiscale DPXA coefficients are a viable alternative to the conventional cross-correlation coefficient. We demonstrate the advantage of the DPXA coefficients over the DCCA coefficients by analyzing contaminated bivariate fractional Brownian motions. We calculate the DPXA coefficients and use them to extract the intrinsic cross correlation between crude oil and gold futures by taking into consideration the impact of the U.S. dollar index. We develop the multifractal DPXA (MF-DPXA) method in order to generalize the DPXA method and investigate multifractal time series. We analyze multifractal binomial measures masked with strong white noises and find that the MF-DPXA method quantifies the hidden multifractal nature while the multifractal DCCA method fails.
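
    The partial-correlation idea at the core of DPXA, regress the common force out of both series before correlating them, can be shown in miniature. This sketch omits the detrending and multiscale machinery of the actual method and uses fabricated series, not the oil/gold/dollar data:

```python
import numpy as np

rng = np.random.default_rng(6)

# Two series driven by a common external force z plus independent parts.
n = 5000
z = rng.normal(size=n)
x = 0.8 * z + rng.normal(scale=0.5, size=n)
y = 0.8 * z + rng.normal(scale=0.5, size=n)

def regress_out(a, b):
    """Residual of a after removing its linear dependence on b."""
    a0, b0 = a - a.mean(), b - b.mean()
    beta = np.dot(a0, b0) / np.dot(b0, b0)
    return a0 - beta * b0

plain = np.corrcoef(x, y)[0, 1]
partial = np.corrcoef(regress_out(x, z), regress_out(y, z))[0, 1]
print(f"plain correlation: {plain:.2f}, partial (z removed): {partial:.2f}")
```

The plain correlation is large but entirely induced by z; the partial correlation collapses toward zero, which is the bias DPXA removes at every detrending scale.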

  7. Operation States Analysis of the Series-Parallel resonant Converter Working Above Resonance Frequency

    Directory of Open Access Journals (Sweden)

    Peter Dzurko

    2007-01-01

    Full Text Available Operation states analysis of a series-parallel converter working above resonance frequency is described in the paper. Principal equations are derived for the individual operation states, and diagrams are constructed on their basis. The diagrams give a complete picture of the converter's behaviour for the individual circuit parameters. The waveforms may be used in designing the individual parts of the inverter.

  8. AAMFT Master Series Tapes: An Analysis of the Inclusion of Feminist Principles into Family Therapy Practice.

    Science.gov (United States)

    Haddock, Shelley A.; MacPhee, David; Zimmerman, Toni Schindler

    2001-01-01

    Content analysis of 23 American Association for Marriage and Family Therapy Master Series tapes was used to determine how well feminist behaviors have been incorporated into ideal family therapy practice. Feminist behaviors were infrequent, being evident in fewer than 3% of time blocks in event sampling and 10 of 39 feminist behaviors of the…

  9. Dynamic factor analysis in the frequency domain: causal modeling of multivariate psychophysiological time series

    NARCIS (Netherlands)

    Molenaar, P.C.M.

    1987-01-01

    Outlines a frequency domain analysis of the dynamic factor model and proposes a solution to the problem of constructing a causal filter of lagged factor loadings. The method is illustrated with applications to simulated and real multivariate time series. The latter applications involve topographic

  10. Harmonic analysis of dense time series of landsat imagery for modeling change in forest conditions

    Science.gov (United States)

    Barry Tyler. Wilson

    2015-01-01

    This study examined the utility of dense time series of Landsat imagery for small area estimation and mapping of change in forest conditions over time. The study area was a region in north central Wisconsin for which Landsat 7 ETM+ imagery and field measurements from the Forest Inventory and Analysis program are available for the decade of 2003 to 2012. For the periods...

  11. Economic Conditions and the Divorce Rate: A Time-Series Analysis of the Postwar United States.

    Science.gov (United States)

    South, Scott J.

    1985-01-01

    Challenges the belief that the divorce rate rises during prosperity and falls during economic recessions. Time-series regression analysis of postwar United States reveals small but positive effects of unemployment on divorce rate. Stronger influences on divorce rates are changes in age structure and labor-force participation rate of women.…

  12. Operation Analysis of the Series-Parallel Resonant Converter Working above Resonance Frequency

    Directory of Open Access Journals (Sweden)

    Peter Dzurko

    2006-01-01

    Full Text Available The present article deals with a theoretical analysis of the operation of a series-parallel converter working above resonance frequency. Principal equations are derived for the individual operation intervals, and based on these, waveforms of the individual quantities are constructed for both inverter operation at load and no-load operation. The waveforms may be used in designing the individual parts of the inverter.

  13. The application of value analysis techniques for complex problems

    International Nuclear Information System (INIS)

    Chiquelin, W.R.; Cossel, S.C.; De Jong, V.J.; Halverson, T.W.

    1986-01-01

    This paper discusses the application of the Value Analysis technique to the transuranic package transporter (TRUPACT). A team representing five different companies or organizations with diverse technical backgrounds was formed to analyze and recommend improvements. The results were a 38% system-wide savings, if incorporated, and a shipping container that is volumetrically and payload efficient as well as user friendly. The Value Analysis technique is a proven tool widely used in many diverse areas, both in government and in the private sector. Value Analysis uses functional diagramming of a piece of equipment or process to discretely identify every facet of the item being analyzed. A standard set of questions is then asked: What is it? What does it do? What does it cost? What else will do the task? And what would that cost? Using logic and a disciplined approach, the result of the Value Analysis is a design that performs the necessary functions at high quality and the lowest overall cost.

  14. A comparative analysis of soft computing techniques for gene prediction.

    Science.gov (United States)

    Goel, Neelam; Singh, Shailendra; Aseri, Trilok Chand

    2013-07-01

    The rapid growth of genomic sequence data for both human and nonhuman species has made analyzing these sequences, especially predicting genes in them, very important, and this is currently the focus of many research efforts. Besides its scientific interest to the molecular biology and genomics community, gene prediction is of considerable importance in human health and medicine. A variety of gene prediction techniques have been developed for eukaryotes over the past few years. This article reviews and analyzes the application of certain soft computing techniques in gene prediction. First, the problem of gene prediction and its challenges are described. These are followed by different soft computing techniques along with their application to gene prediction. In addition, a comparative analysis of different soft computing techniques for gene prediction is given. Finally some limitations of the current research activities and future research directions are provided. Copyright © 2013 Elsevier Inc. All rights reserved.

  15. Comparing dynamical systems concepts and techniques for biomechanical analysis

    OpenAIRE

    van Emmerik, Richard E.A.; Ducharme, Scott W.; Amado, Avelino C.; Hamill, Joseph

    2016-01-01

    Traditional biomechanical analyses of human movement are generally derived from linear mathematics. While these methods can be useful in many situations, they do not describe behaviors in human systems that are predominately nonlinear. For this reason, nonlinear analysis methods based on a dynamical systems approach have become more prevalent in recent literature. These analysis techniques have provided new insights into how systems (1) maintain pattern stability, (2) transition into new states…

  16. Reliability Analysis Techniques for Communication Networks in Nuclear Power Plant

    International Nuclear Information System (INIS)

    Lim, T. J.; Jang, S. C.; Kang, H. G.; Kim, M. C.; Eom, H. S.; Lee, H. J.

    2006-09-01

    The objective of this project is to investigate and study existing reliability analysis techniques for communication networks in order to develop reliability analysis models for the safety-critical networks of nuclear power plants. It is necessary to make a comprehensive survey of current methodologies for communication network reliability. The major outputs of this study are design characteristics of safety-critical communication networks, efficient algorithms for quantifying the reliability of communication networks, and preliminary models for assessing the reliability of safety-critical communication networks.
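
    One standard way to quantify communication network reliability is two-terminal reliability: the probability that a source can still reach a destination given independent link availabilities. For small topologies it can be computed exactly by state enumeration; the network and availability figures below are an invented toy example, not a plant network.

```python
from itertools import product

# Toy redundant network: 4 nodes, links with independent availabilities.
# Question: probability that node 0 can still reach node 3.
links = {(0, 1): 0.9, (1, 3): 0.9, (0, 2): 0.8, (2, 3): 0.8, (1, 2): 0.95}

def connected(up_links):
    """Depth-first search from node 0; is node 3 reachable over the
    currently working links?"""
    seen, stack = {0}, [0]
    while stack:
        u = stack.pop()
        for (a, b) in up_links:
            for (src, dst) in ((a, b), (b, a)):
                if src == u and dst not in seen:
                    seen.add(dst)
                    stack.append(dst)
    return 3 in seen

# Exact reliability by enumerating all 2^5 link up/down states.
edges = list(links)
reliability = 0.0
for states in product([0, 1], repeat=len(edges)):
    p = 1.0
    up = []
    for e, s in zip(edges, states):
        p *= links[e] if s else 1 - links[e]
        if s:
            up.append(e)
    reliability += p * connected(up)

print(f"two-terminal reliability 0->3: {reliability:.5f}")
```

Enumeration is exponential in the number of links, which is why the survey mentions efficient algorithms (factoring, bounds, Monte Carlo) for realistic network sizes.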

  17. Unified functional network and nonlinear time series analysis for complex systems science: The pyunicorn package

    Science.gov (United States)

    Donges, Jonathan; Heitzig, Jobst; Beronov, Boyan; Wiedermann, Marc; Runge, Jakob; Feng, Qing Yi; Tupikina, Liubov; Stolbova, Veronika; Donner, Reik; Marwan, Norbert; Dijkstra, Henk; Kurths, Jürgen

    2016-04-01

    We introduce the pyunicorn (Pythonic unified complex network and recurrence analysis toolbox) open source software package for applying and combining modern methods of data analysis and modeling from complex network theory and nonlinear time series analysis. pyunicorn is a fully object-oriented and easily parallelizable package written in the language Python. It allows for the construction of functional networks such as climate networks in climatology or functional brain networks in neuroscience representing the structure of statistical interrelationships in large data sets of time series and, subsequently, investigating this structure using advanced methods of complex network theory such as measures and models for spatial networks, networks of interacting networks, node-weighted statistics, or network surrogates. Additionally, pyunicorn provides insights into the nonlinear dynamics of complex systems as recorded in uni- and multivariate time series from a non-traditional perspective by means of recurrence quantification analysis, recurrence networks, visibility graphs, and construction of surrogate time series. The range of possible applications of the library is outlined, drawing on several examples mainly from the field of climatology. pyunicorn is available online at https://github.com/pik-copan/pyunicorn. Reference: J.F. Donges, J. Heitzig, B. Beronov, M. Wiedermann, J. Runge, Q.-Y. Feng, L. Tupikina, V. Stolbova, R.V. Donner, N. Marwan, H.A. Dijkstra, and J. Kurths, Unified functional network and nonlinear time series analysis for complex systems science: The pyunicorn package, Chaos 25, 113101 (2015), DOI: 10.1063/1.4934554, Preprint: arxiv.org:1507.01571 [physics.data-an].
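
    Two of the recurrence-quantification measures pyunicorn provides, recurrence rate and determinism, can be re-derived in a few lines of NumPy to show what they measure; this sketch is an independent illustration and does not use pyunicorn's actual API.

```python
import numpy as np

def rqa(x, eps):
    """Tiny recurrence-quantification sketch: recurrence rate and
    determinism from the recurrence matrix of a scalar series."""
    n = len(x)
    R = (np.abs(x[:, None] - x[None, :]) < eps).astype(int)
    recurrence_rate = R.sum() / n**2
    # Determinism: fraction of recurrent points lying on diagonal
    # line segments of length >= 2 (upper triangle, then doubled).
    diag_points = 0
    for k in range(1, n):
        run = 0
        for v in np.diagonal(R, offset=k):
            if v:
                run += 1
            else:
                if run >= 2:
                    diag_points += run
                run = 0
        if run >= 2:
            diag_points += run
    off_diag_total = R.sum() - np.trace(R)
    determinism = 2 * diag_points / off_diag_total if off_diag_total else 0.0
    return recurrence_rate, determinism

t = np.linspace(0, 8 * np.pi, 400)
rr_sine, det_sine = rqa(np.sin(t), eps=0.1)
rng = np.random.default_rng(7)
rr_noise, det_noise = rqa(rng.normal(size=400), eps=0.1)
print(f"sine: DET={det_sine:.2f}, noise: DET={det_noise:.2f}")
```

A deterministic signal yields long diagonal lines (determinism near 1), while white noise does not; pyunicorn computes these and many more measures on embedded, multivariate data.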

  18. Analytical techniques for wine analysis: An African perspective; a review

    International Nuclear Information System (INIS)

    Villiers, André de; Alberts, Phillipus; Tredoux, Andreas G.J.; Nieuwoudt, Hélène H.

    2012-01-01

    Highlights: ► Analytical techniques developed for grape and wine analysis in Africa are reviewed. ► The utility of infrared spectroscopic methods is demonstrated. ► An overview of separation of wine constituents by GC, HPLC, CE is presented. ► Novel LC and GC sample preparation methods for LC and GC are presented. ► Emerging methods for grape and wine analysis in Africa are discussed. - Abstract: Analytical chemistry is playing an ever-increasingly important role in the global wine industry. Chemical analysis of wine is essential in ensuring product safety and conformity to regulatory laws governing the international market, as well as understanding the fundamental aspects of grape and wine production to improve manufacturing processes. Within this field, advanced instrumental analysis methods have been exploited more extensively in recent years. Important advances in instrumental analytical techniques have also found application in the wine industry. This review aims to highlight the most important developments in the field of instrumental wine and grape analysis in the African context. The focus of this overview is specifically on the application of advanced instrumental techniques, including spectroscopic and chromatographic methods. Recent developments in wine and grape analysis and their application in the African context are highlighted, and future trends are discussed in terms of their potential contribution to the industry.

  19. Analytical techniques for wine analysis: An African perspective; a review

    Energy Technology Data Exchange (ETDEWEB)

    Villiers, Andre de, E-mail: ajdevill@sun.ac.za [Department of Chemistry and Polymer Science, Stellenbosch University, Private Bag X1, Matieland 7602, Stellenbosch (South Africa); Alberts, Phillipus [Department of Chemistry and Polymer Science, Stellenbosch University, Private Bag X1, Matieland 7602, Stellenbosch (South Africa); Tredoux, Andreas G.J.; Nieuwoudt, Helene H. [Institute for Wine Biotechnology, Department of Viticulture and Oenology, Stellenbosch University, Private Bag X1, Matieland 7602, Stellenbosch (South Africa)

    2012-06-12

    Highlights: ► Analytical techniques developed for grape and wine analysis in Africa are reviewed. ► The utility of infrared spectroscopic methods is demonstrated. ► An overview of separation of wine constituents by GC, HPLC, CE is presented. ► Novel LC and GC sample preparation methods for LC and GC are presented. ► Emerging methods for grape and wine analysis in Africa are discussed. - Abstract: Analytical chemistry is playing an ever-increasingly important role in the global wine industry. Chemical analysis of wine is essential in ensuring product safety and conformity to regulatory laws governing the international market, as well as understanding the fundamental aspects of grape and wine production to improve manufacturing processes. Within this field, advanced instrumental analysis methods have been exploited more extensively in recent years. Important advances in instrumental analytical techniques have also found application in the wine industry. This review aims to highlight the most important developments in the field of instrumental wine and grape analysis in the African context. The focus of this overview is specifically on the application of advanced instrumental techniques, including spectroscopic and chromatographic methods. Recent developments in wine and grape analysis and their application in the African context are highlighted, and future trends are discussed in terms of their potential contribution to the industry.

  20. Evolution of the sedimentation technique for particle size distribution analysis

    International Nuclear Information System (INIS)

    Maley, R.

    1998-01-01

    After an introduction on the significance of particle size measurements, sedimentation methods are described, with emphasis on the evolution of the gravitational approach. The gravitational technique based on mass determination by X-ray absorption allows fast analysis through automation and easy data handling, in addition to providing the accuracy required by quality control and research applications.

  1. Comparative Analysis of Some Techniques in the Biological ...

    African Journals Online (AJOL)

    The experiments involved the simulation of conditions of a major spill by pouring crude oil on the cells from perforated cans, and the in-situ bioremediation of the polluted soils using techniques that consisted in manipulating different variables within the soil environment. The analysis of soil characteristics after a ...

  2. Tailored Cloze: Improved with Classical Item Analysis Techniques.

    Science.gov (United States)

    Brown, James Dean

    1988-01-01

    The reliability and validity of a cloze procedure used as an English-as-a-second-language (ESL) test in China were improved by applying traditional item analysis and selection techniques. The 'best' test items were chosen on the basis of item facility and discrimination indices, and were administered as a 'tailored cloze.' 29 references listed.…
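
    The classical item statistics named above are simple to compute. The response matrix below is simulated from a toy ability/difficulty model (not the Chinese ESL data), and the selection thresholds are illustrative:

```python
import numpy as np

rng = np.random.default_rng(8)

# Hypothetical 0/1 response matrix: 100 examinees x 20 cloze items.
ability = rng.normal(size=(100, 1))
difficulty = rng.normal(size=(1, 20))
responses = (ability - difficulty + rng.normal(size=(100, 20)) > 0).astype(int)

total = responses.sum(axis=1)

# Item facility: proportion of examinees answering the item correctly.
facility = responses.mean(axis=0)

# Discrimination: upper-group minus lower-group facility, using the
# conventional top and bottom 27% by total score.
order = np.argsort(total)
k = int(0.27 * len(total))
lower, upper = responses[order[:k]], responses[order[-k:]]
discrimination = upper.mean(axis=0) - lower.mean(axis=0)

# "Tailored" selection: keep items with mid-range facility and adequate
# discrimination (cutoffs here are illustrative, not the study's).
keep = (facility > 0.3) & (facility < 0.7) & (discrimination > 0.3)
print(f"kept {keep.sum()} of 20 items")
```

Re-administering only the kept items is the "tailored cloze" idea: the surviving items are neither too easy nor too hard and separate strong from weak examinees.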

  3. Comparison of the Performance of Two Advanced Spectral Methods for the Analysis of Times Series in Paleoceanography

    Directory of Open Access Journals (Sweden)

    Eulogio Pardo-Igúzquiza

    2015-08-01

    Full Text Available Many studies have revealed the cyclicity of past ocean/atmosphere dynamics at a wide range of time scales (from decadal to millennial), based on the spectral analysis of time series of climate proxies obtained from deep-sea sediment cores. Among the many techniques available for spectral analysis, the maximum entropy method and the Thomson multitaper approach have frequently been used because of their good statistical properties and high resolution with short time series. The novelty of the present study is that we compared the two methods according to the performance of the statistical tests used to assess the significance of their power spectrum estimates. The statistical significance of the maximum entropy estimates was assessed by a random permutation test (Pardo-Igúzquiza and Rodríguez-Tovar, 2000), while the statistical significance of the Thomson multitaper method was assessed by an F-test (Thomson, 1982). We compared the results obtained in a case study using simulated data, where the spectral content of the time series was known, and in a case study with real data. In both cases the results are similar: while the cycles identified as significant by maximum entropy with the permutation test have a clear physical interpretation, the F-test with the Thomson multitaper estimator tends to classify the low-frequency peaks as not significant and tends to flag more spurious peaks as significant in the middle and high frequencies. Nevertheless, the best strategy is to use both techniques together, exploiting the advantages of each.
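
    The permutation-test idea for spectral significance can be sketched with a plain periodogram (rather than a maximum entropy estimator) on fabricated data: shuffling the series destroys serial structure while keeping the value distribution, giving a null level against which observed peaks are judged.

```python
import numpy as np

rng = np.random.default_rng(9)

# Series with one genuine cycle (period 64 samples) buried in noise.
n = 512
t = np.arange(n)
x = np.sin(2 * np.pi * t / 64.0) + 2.0 * rng.normal(size=n)

def power_spectrum(y):
    return np.abs(np.fft.rfft(y - y.mean()))**2

spec = power_spectrum(x)

# Permutation null: the largest spectral value of each shuffled series.
n_perm = 200
null_max = np.empty(n_perm)
for i in range(n_perm):
    null_max[i] = power_spectrum(rng.permutation(x)).max()

threshold = np.quantile(null_max, 0.95)   # global 95% significance level
significant = np.where(spec > threshold)[0]
periods = n / significant[significant > 0]
print(f"significant periods (samples): {periods}")
```

The seeded cycle at frequency bin 8 (period 64) should stand far above the permutation threshold, while pure-noise bins should mostly fall below it.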

  4. INTERNAL LIMITING MEMBRANE PEELING VERSUS INVERTED FLAP TECHNIQUE FOR TREATMENT OF FULL-THICKNESS MACULAR HOLES: A COMPARATIVE STUDY IN A LARGE SERIES OF PATIENTS.

    Science.gov (United States)

    Rizzo, Stanislao; Tartaro, Ruggero; Barca, Francesco; Caporossi, Tomaso; Bacherini, Daniela; Giansanti, Fabrizio

    2017-12-08

    The inverted flap (IF) technique has recently been introduced in macular hole (MH) surgery. The IF technique has shown an increased success rate in the case of large MHs and of MHs associated with high myopia. This study reports the anatomical and functional results in a large series of patients affected by MH treated using pars plana vitrectomy and gas tamponade combined with internal limiting membrane (ILM) peeling or IF. This is a retrospective, consecutive, nonrandomized comparative study of patients affected by idiopathic or myopic MH treated using small-gauge pars plana vitrectomy (25- or 23-gauge) between January 2011 and May 2016. The patients were divided into two groups according to the ILM removal technique (complete removal vs. IF), and a subgroup analysis was performed according to the MH diameter. … patients underwent pars plana vitrectomy and ILM peeling, and 320 patients underwent pars plana vitrectomy and IF. Overall, 84.94% of the patients had complete anatomical success, characterized by MH closure after the operation. In particular, among the patients who underwent only ILM peeling the closure rate was 78.75%, and among the patients who underwent the IF technique it was 91.93% (P = 0.001). Among the patients affected by full-thickness MH ≥400 µm, success was achieved in 95.6% of the cases in the IF group and in 78.6% in the ILM peeling group (P = 0.001); among the patients with an axial length ≥26 mm, success was achieved in 88.4% of the cases in the IF group and in 38.9% in the ILM peeling group (P = 0.001). Average preoperative best-corrected visual acuity was 0.77 (SD = 0.32) logarithm of the minimum angle of resolution (20/118 Snellen) in the peeling group and 0.74 (SD = 0.33) logarithm of the minimum angle of resolution (20/110 Snellen) in the IF group (P = 0.31). Mean postoperative best-corrected visual acuity was 0.52 (SD = 0.42) logarithm of the minimum angle of resolution (20/66 Snellen) in the peeling group and 0.43 (SD = 0.31) logarithm of the minimum angle of resolution (20…

  5. Detection and Characterization of Ground Displacement Sources from Variational Bayesian Independent Component Analysis of GPS Time Series

    Science.gov (United States)

    Gualandi, A.; Serpelloni, E.; Belardinelli, M. E.

    2014-12-01

    A critical point in the analysis of ground displacement time series is the development of data-driven methods that can discern and characterize the different sources generating the observed displacements. A widely used multivariate statistical technique is Principal Component Analysis (PCA), which reduces the dimensionality of the data space while retaining most of the explained variance of the dataset. It reproduces the original data using a limited number of Principal Components, but it also has some deficiencies. In particular, PCA does not perform well on the so-called Blind Source Separation (BSS) problem, i.e. in recovering and separating the original sources that generated the observed data. This is mainly due to the assumptions on which PCA relies: it looks for a new Euclidean space where the projected data are uncorrelated. Uncorrelatedness is usually not a strong enough condition, and it has been shown that the BSS problem can be tackled by requiring the components to be independent. Independent Component Analysis (ICA) is another popular technique adopted to approach this problem, and it can be used wherever PCA is also applicable. An ICA approach enables us to explain the time series while imposing fewer constraints on the model, and to reveal anomalies in the data such as transient signals. However, the independence condition is not easy to impose, and it is often necessary to introduce approximations. To work around this problem, we use a variational Bayesian ICA (vbICA) method, which models the probability density function (pdf) of each source signal using a mixture of Gaussian distributions. This technique allows more flexibility in the description of the source pdfs, giving a more reliable estimate of them. Here we present the application of the vbICA technique to GPS position time series. First, we use vbICA on synthetic data that simulate a seismic cycle
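The dimensionality-reduction step that vbICA improves upon can be illustrated with a minimal PCA sketch. This is not the authors' vbICA code; it is a self-contained, pure-Python illustration (closed-form eigendecomposition of a 2x2 covariance matrix) of how PCA finds uncorrelated axes ordered by explained variance, the starting point whose limitations for source separation motivate ICA:

```python
import math

def pca_2d(points):
    """PCA for 2-D data via the closed-form eigendecomposition of the
    2x2 sample covariance matrix. Returns eigenvalues (descending)
    and the unit eigenvector of the leading component."""
    n = len(points)
    mx = sum(p[0] for p in points) / n
    my = sum(p[1] for p in points) / n
    # unbiased covariance matrix entries
    sxx = sum((p[0] - mx) ** 2 for p in points) / (n - 1)
    syy = sum((p[1] - my) ** 2 for p in points) / (n - 1)
    sxy = sum((p[0] - mx) * (p[1] - my) for p in points) / (n - 1)
    # eigenvalues of [[sxx, sxy], [sxy, syy]]
    tr, det = sxx + syy, sxx * syy - sxy * sxy
    disc = math.sqrt(max(tr * tr / 4 - det, 0.0))
    l1, l2 = tr / 2 + disc, tr / 2 - disc
    # eigenvector for the leading eigenvalue
    if abs(sxy) > 1e-12:
        v1 = (l1 - syy, sxy)
    else:
        v1 = (1.0, 0.0) if sxx >= syy else (0.0, 1.0)
    norm = math.hypot(v1[0], v1[1])
    return (l1, l2), (v1[0] / norm, v1[1] / norm)

# Points scattered along the line y = 2x: the first principal
# component should capture nearly all of the variance.
pts = [(t, 2 * t + 0.1 * ((-1) ** i)) for i, t in enumerate(
    [0.0, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0, 3.5])]
(l1, l2), v1 = pca_2d(pts)
explained = l1 / (l1 + l2)
```

For more than two dimensions one would use an SVD routine; the point here is only that the leading component captures most of the variance of correlated channels while saying nothing about the statistical independence of the underlying sources.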

  6. Spectroscopic analysis technique for arc-welding process control

    Science.gov (United States)

    Mirapeix, Jesús; Cobo, Adolfo; Conde, Olga; Quintela, María Ángeles; López-Higuera, José-Miguel

    2005-09-01

    The spectroscopic analysis of the light emitted by thermal plasmas has found many applications, from chemical analysis to the monitoring and control of industrial processes. In particular, it has been demonstrated that analysis of the thermal plasma generated during arc or laser welding can supply information about the process and, thus, about the quality of the weld. In some critical applications (e.g. the aerospace sector), early, real-time detection of defects in the weld seam (oxidation, porosity, lack of penetration, ...) is highly desirable, as it can reduce expensive non-destructive testing (NDT). Among other techniques, full spectroscopic analysis of the plasma emission is known to offer rich information about the process itself, but it is also very demanding in terms of real-time implementation. In this paper, we propose a technique for the analysis of the plasma emission spectrum that is able to detect, in real time, changes in the process parameters that could lead to the formation of defects in the weld seam. It is based on estimating the electronic temperature of the plasma through the analysis of the emission peaks of multiple atomic species. Unlike traditional techniques, which usually involve fitting peaks to Voigt functions using the recursive Levenberg-Marquardt method, we employ the LPO (Linear Phase Operator) sub-pixel algorithm to accurately estimate the central wavelength of the peaks (allowing automatic identification of each atomic species) and cubic-spline interpolation of the noisy data to obtain the intensity and width of the peaks. Experimental tests on TIG welding, using fiber-optic capture of light and a low-cost CCD-based spectrometer, show that some typical defects can be easily detected and identified with this technique, whose typical processing time for multiple-peak analysis is less than 20 ms on a conventional PC.
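The electronic-temperature estimate at the heart of such methods can be sketched with the classical two-line Boltzmann relation. The line parameters below are hypothetical placeholders, not values from the paper, and the snippet ignores the peak-finding (LPO) and interpolation stages; it only shows how a temperature falls out of two measured peak intensities:

```python
import math

K_B_EV = 8.617333262e-5  # Boltzmann constant in eV/K

def electron_temperature(I1, I2, line1, line2):
    """Two-line Boltzmann method. Each line is (A, g, E_upper_eV,
    wavelength_nm), with the intensity model I ∝ (A*g/λ)*exp(-E/(kT)).
    Solving the ratio I1/I2 for T gives T = (E2-E1)/(k*ln(ratio))."""
    A1, g1, E1, w1 = line1
    A2, g2, E2, w2 = line2
    ratio = (I1 * A2 * g2 * w1) / (I2 * A1 * g1 * w2)
    return (E2 - E1) / (K_B_EV * math.log(ratio))

def intensity(line, T):
    """Forward model for a single emission line (same assumptions)."""
    A, g, E, w = line
    return (A * g / w) * math.exp(-E / (K_B_EV * T))

# Round-trip check with hypothetical argon-like line parameters:
line1 = (2.0e7, 5, 13.3, 696.5)   # (A [1/s], g, E_upper [eV], λ [nm])
line2 = (1.0e7, 3, 15.0, 750.4)
T_true = 9000.0                    # assumed plasma temperature [K]
I1, I2 = intensity(line1, T_true), intensity(line2, T_true)
T_est = electron_temperature(I1, I2, line1, line2)
```

In practice one fits many lines on a Boltzmann plot rather than relying on a single pair, since the two-line estimate is sensitive to intensity noise when the upper-level energies are close.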

  7. Coastal city subsidence in Shenzhen (China), monitored using multi-frequency radar interferometry time-series techniques

    Science.gov (United States)

    Liu, Peng; Li, Yongsheng; Singleton, Andrew; Li, Qingquan; Zhang, Jingfa; Li, Zhenhong

    2014-05-01

    In just 26 years, the coastal city of Shenzhen (Southern China) has been transformed from a small fishing village to a modern city with a population exceeding 8.5 million people. Following its designation as a Special Economic Zone in the 1980s, the city became a test bed for China's economic reforms and currently leads many new practices in urban planning. The rapid economic development was matched by a sharp increase in the demand for usable land and consequently, extensive coastal reclamation has been undertaken by piling rock fragments from nearby hills onto the seabed. However, it has recently been reported that new apartments, offices and transport networks built on the reclaimed land have become unusable due to ground subsidence. The additional threat of coastal inundation from sea-level rise also requires serious consideration. InSAR time-series techniques (such as Persistent Scatterer and Small Baseline InSAR) are capable of detecting sub-centimetre elevation changes of the Earth's surface over large areas and at a density far exceeding the capabilities of a GPS network - particularly for such an urban environment as Shenzhen. This study uses numerous independent tracks of SAR data (two ENVISAT C-band tracks and two ALOS L-band tracks) to determine the surface movements between 2004 and 2013. Quantitative comparative analyses are carried out in the overlapping area between two adjacent tracks, and thus no ground data is required to validate InSAR results. The results show greatest subsidence in coastal areas with the areas of reclaimed land also predominantly undergoing subsidence. The combination of different ascending and descending tracks allows 2D velocity fields to be estimated and it will be important to determine whether the subsidence from the recently reclaimed land is consolidation or part of a longer-term trend. 
This ability to provide accurate measurements of ground stability for the city of Shenzhen will help focus investigations into areas of

  8. Time Series Imputation via L1 Norm-Based Singular Spectrum Analysis

    Science.gov (United States)

    Kalantari, Mahdi; Yarmohammadi, Masoud; Hassani, Hossein; Silva, Emmanuel Sirimal

    Missing values in time series data are a well-known and important problem which many researchers have studied extensively in various fields. In this paper, a new nonparametric approach for missing value imputation in time series is proposed. The main novelty of this research is applying the L1 norm-based version of Singular Spectrum Analysis (SSA), namely L1-SSA, which is robust against outliers. The performance of the new imputation method has been compared with many other established methods by applying them to various real and simulated time series. The obtained results confirm that the SSA-based methods, especially L1-SSA, can provide better imputation than the other methods.
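The paper benchmarks L1-SSA against established imputation methods. One of the simplest such baselines, linear interpolation between the nearest observed neighbours, can be sketched as follows; this is a generic baseline for comparison, not the L1-SSA algorithm itself:

```python
def interpolate_missing(series):
    """Fill None gaps by linear interpolation between the nearest
    observed neighbours; leading/trailing gaps are filled by carrying
    the nearest observed value (a common baseline imputation)."""
    out = list(series)
    n = len(out)
    i = 0
    while i < n:
        if out[i] is None:
            j = i
            while j < n and out[j] is None:
                j += 1                       # end of this gap
            left = out[i - 1] if i > 0 else None
            right = out[j] if j < n else None
            for k in range(i, j):
                if left is None:             # leading gap: carry back
                    out[k] = right
                elif right is None:          # trailing gap: carry forward
                    out[k] = left
                else:
                    frac = (k - i + 1) / (j - i + 1)
                    out[k] = left + frac * (right - left)
            i = j
        else:
            i += 1
    return out
```

SSA-based imputation instead reconstructs the missing stretch from the low-rank structure of the trajectory matrix, which is why it can outperform purely local rules like this one, particularly in the presence of oscillatory components.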

  9. Analysis of Land Subsidence Monitoring in Mining Area with Time-Series Insar Technology

    Science.gov (United States)

    Sun, N.; Wang, Y. J.

    2018-04-01

    Time-series InSAR technology has become a popular land subsidence monitoring method in recent years because of its advantages: high accuracy, wide coverage, low cost, dense monitoring points, and freedom from accessibility restrictions. In this paper, we applied two kinds of satellite data, ALOS PALSAR and RADARSAT-2, to obtain subsidence monitoring results for the study area in two time periods using time-series InSAR technology. By analyzing the deformation range, rate, and amount, a time-series analysis of land subsidence in the mining area was realized. The results show that InSAR technology can monitor land subsidence over large areas and meet the demands of subsidence monitoring in mining areas.

  10. Determination and identification of naturally occurring decay series using milli-second order pulse time interval analysis (TIA)

    International Nuclear Information System (INIS)

    Hashimoto, T.; Sanada, Y.; Uezu, Y.

    2003-01-01

    A delayed coincidence method, called the time interval analysis (TIA) method, has been successfully applied to the selective determination of correlated α-α decay events with millisecond-order lifetimes. A main decay process applicable to TIA treatment is 220Rn → 216Po (T1/2: 145 ms) → {Th series}. TIA is fundamentally based on the difference between the time-interval distributions of correlated decay events and of other events, such as background or random events, when the time-interval data are compiled within a fixed time window (for example, a tenth of the relevant half-life). The sensitivity of the TIA analysis of correlated α-α decay events was subsequently improved through background elimination: pulse-shape discrimination (PSD with a PERALS counter) to reject β/γ pulses, purging the extractive scintillator with nitrogen gas, and solvent extraction of Ra. (author)
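The gate-window trade-off underlying TIA can be quantified with two closed-form expressions: intervals between correlated 220Rn→216Po pairs follow the daughter's exponential decay law, while intervals between random events follow an exponential distribution set by the total count rate. A minimal sketch (the background count rate used in the demo is an illustrative assumption, not a value from the paper):

```python
import math

def correlated_fraction(window_ms, half_life_ms=145.0):
    """Fraction of true parent-daughter pairs whose decay interval
    falls inside the coincidence window T: 1 - 2^(-T / T_half)."""
    return 1.0 - 2.0 ** (-window_ms / half_life_ms)

def random_pair_fraction(window_ms, count_rate_per_s):
    """Chance fraction for uncorrelated (random) events, whose
    inter-event intervals are exponentially distributed with the
    count rate: 1 - exp(-rate * T)."""
    return 1.0 - math.exp(-count_rate_per_s * window_ms / 1000.0)

# A window of five half-lives (725 ms) captures 1 - 2^-5 = 96.875 %
# of true 220Rn -> 216Po pairs, while at an assumed background rate
# of 0.1 counts/s only ~7 % of random intervals fall inside it.
f_true = correlated_fraction(725.0)
f_rand = random_pair_fraction(725.0, 0.1)
```

This contrast between the two interval distributions is exactly what lets TIA separate the correlated α-α signal from background, and it degrades as the total count rate rises, which is why the PSD and chemical-purification steps that suppress background matter.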

  11. Study of analysis techniques of thermoluminescent dosimeters response

    International Nuclear Information System (INIS)

    Castro, Walber Amorim

    2002-01-01

    The Personal Monitoring Service of the Centro Regional de Ciencias Nucleares uses TLD-700 material in its dosemeters. The TLD analysis is carried out using a Harshaw-Bicron model 6600 automatic reading system, which uses dry air instead of the traditional gaseous nitrogen. This innovation brought advantages to the service but introduced uncertainties in the readout of the detectors; one of these was observed for doses below 0.5 mSv. In this work, different techniques for analyzing the TLD response were investigated and compared for dose values in this interval, including thermal pre-treatment and different methods of glow-curve analysis. The obtained results showed the necessity of developing specific software that permits automatic background subtraction from the glow curve of each dosemeter. This software was developed and has been tested. Preliminary results show that the software increases the response reproducibility. (author)

  12. Sensitivity analysis technique for application to deterministic models

    International Nuclear Information System (INIS)

    Ishigami, T.; Cazzoli, E.; Khatib-Rahbar, M.; Unwin, S.D.

    1987-01-01

    The characterization of severe accident source terms for light water reactors should include consideration of uncertainties. An important element of any uncertainty analysis is an evaluation of the sensitivity of the output probability distributions reflecting source term uncertainties to assumptions regarding the input probability distributions. Historically, response surface methods (RSMs) were developed to replace physical models with simplified models (built using, for example, regression techniques) for extensive calculations. The purpose of this paper is to present a new method for sensitivity analysis that does not utilize an RSM but instead relies directly on the results obtained from the original computer code calculations. The merits of this approach are demonstrated by application of the proposed method to the suppression pool aerosol removal code (SPARC), and the results are compared with those obtained by sensitivity analysis with (a) the code itself, (b) a regression model, and (c) Iman's method
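The contrast between fitting an RSM surrogate and working directly with code output can be illustrated with a one-at-a-time finite-difference scheme run straight on the model. This is a generic sketch, not the paper's specific method (nor Iman's), and the toy model stands in for an expensive code such as SPARC:

```python
def direct_sensitivity(model, base, step=1e-6):
    """Central-difference sensitivities d(output)/d(input_i), computed
    directly from model evaluations rather than from a fitted
    response-surface surrogate."""
    sens = []
    for i in range(len(base)):
        up = list(base)
        dn = list(base)
        up[i] += step
        dn[i] -= step
        sens.append((model(up) - model(dn)) / (2 * step))
    return sens

# Toy "code": output depends quadratically on x0 and linearly on x1,
# so at the base point [2, 1] the sensitivities are ~[4.0, 3.0].
f = lambda x: x[0] ** 2 + 3.0 * x[1]
s = direct_sensitivity(f, [2.0, 1.0])
```

The price of the direct approach is one extra pair of code runs per input; its benefit is that no surrogate-model approximation error enters the sensitivity estimates.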

  13. Multivariate Analysis Techniques for Optimal Vision System Design

    DEFF Research Database (Denmark)

    Sharifzadeh, Sara

    The present thesis considers optimization of the spectral vision systems used for quality inspection of food items. The relationship between food quality, vision-based techniques and spectral signatures is described. The vision instruments for food analysis as well as datasets of the food items...... used in this thesis are described. The methodological strategies are outlined, including sparse regression and pre-processing based on feature selection and extraction methods, supervised versus unsupervised analysis, and linear versus non-linear approaches. One supervised feature selection algorithm...... (SSPCA) and DCT based characterization of the spectral diffused reflectance images for wavelength selection and discrimination. These methods, together with some other state-of-the-art statistical and mathematical analysis techniques, are applied on datasets of different food items; meat, dairy, fruits...

  14. Multiple predictor smoothing methods for sensitivity analysis: Description of techniques

    International Nuclear Information System (INIS)

    Storlie, Curtis B.; Helton, Jon C.

    2008-01-01

    The use of multiple predictor smoothing methods in sampling-based sensitivity analyses of complex models is investigated. Specifically, sensitivity analysis procedures based on smoothing methods employing the stepwise application of the following nonparametric regression techniques are described: (i) locally weighted regression (LOESS), (ii) additive models, (iii) projection pursuit regression, and (iv) recursive partitioning regression. Then, in the second and concluding part of this presentation, the indicated procedures are illustrated with both simple test problems and results from a performance assessment for a radioactive waste disposal facility (i.e., the Waste Isolation Pilot Plant). As shown by the example illustrations, the use of smoothing procedures based on nonparametric regression techniques can yield more informative sensitivity analysis results than can be obtained with more traditional sensitivity analysis procedures based on linear regression, rank regression or quadratic regression when nonlinear relationships between model inputs and model predictions are present

  15. Sensitivity analysis of machine-learning models of hydrologic time series

    Science.gov (United States)

    O'Reilly, A. M.

    2017-12-01

    Sensitivity analysis traditionally has been applied to assess model response to perturbations in model parameters, where the parameters are those model input variables adjusted during calibration. Unlike physics-based models, where parameters represent real phenomena, the equivalent of parameters for machine-learning models are simply mathematical "knobs" that are automatically adjusted during training/testing/verification procedures. Thus the challenge of extracting knowledge of hydrologic system functionality from machine-learning models lies in their very nature, leading to the label "black box." Sensitivity analysis of the forcing-response behavior of machine-learning models, however, can provide understanding of how the physical phenomena represented by model inputs affect the physical phenomena represented by model outputs. As part of a previous study, hybrid spectral-decomposition artificial neural network (ANN) models were developed to simulate the observed behavior of hydrologic response contained in multidecadal datasets of lake water level, groundwater level, and spring flow. Model inputs used moving window averages (MWA) to represent various frequencies and frequency-band components of time series of rainfall and groundwater use. Using these forcing time series, the MWA-ANN models were trained to predict time series of lake water level, groundwater level, and spring flow at 51 sites in central Florida, USA. A time series of sensitivities for each MWA-ANN model was produced by perturbing the forcing time series and computing the change in the response time series per unit change in perturbation. Variations in forcing-response sensitivities are evident among types (lake, groundwater level, or spring), spatially (among sites of the same type), and temporally. Two characteristics generally common among sites are more uniform sensitivities to rainfall over time and notable increases in sensitivities to groundwater usage during significant drought periods.

  16. Application of the Allan Variance to Time Series Analysis in Astrometry and Geodesy: A Review.

    Science.gov (United States)

    Malkin, Zinovy

    2016-04-01

    The Allan variance (AVAR) was introduced 50 years ago as a statistical tool for assessing the stability of frequency standards. Over the past decades, AVAR has increasingly been used in geodesy and astrometry to assess the noise characteristics of geodetic and astrometric time series. A specific feature of astrometric and geodetic measurements, as compared with clock measurements, is that they are generally associated with uncertainties; thus, an appropriate weighting should be applied during data analysis. In addition, some physically connected scalar time series naturally form series of multidimensional vectors. For example, the three station coordinate time series X, Y, and Z can be combined to analyze 3-D station position variations. The classical AVAR is not intended for processing unevenly weighted and/or multidimensional data. Therefore, AVAR modifications, namely weighted AVAR (WAVAR), multidimensional AVAR (MAVAR), and weighted multidimensional AVAR (WMAVAR), were introduced to overcome these deficiencies. In this paper, a brief review is given of the experience of using AVAR and its modifications in processing astrogeodetic time series.
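For a series of consecutive averages y_i, the classical AVAR and a weighted variant can be sketched in a few lines. The weighting shown is one common inverse-variance choice for unevenly weighted data, offered as an illustration rather than as the exact WAVAR definition reviewed in the paper:

```python
def allan_variance(y):
    """Classical (non-overlapping, lag-1) Allan variance of a series
    of consecutive averages y: AVAR = <(y[i+1] - y[i])^2> / 2."""
    n = len(y)
    return sum((y[i + 1] - y[i]) ** 2 for i in range(n - 1)) / (2 * (n - 1))

def weighted_allan_variance(y, sigma):
    """Weighted variant: the same first differences, each weighted by
    the inverse of the summed variances of its two points (one common
    weighting choice for data with per-point uncertainties)."""
    m = len(y) - 1
    w = [1.0 / (sigma[i] ** 2 + sigma[i + 1] ** 2) for i in range(m)]
    d2 = [(y[i + 1] - y[i]) ** 2 for i in range(m)]
    return sum(wi * di for wi, di in zip(w, d2)) / (2 * sum(w))

# A pure alternating series has unit first differences, so AVAR = 0.5;
# with equal per-point sigmas the weighted variant must agree.
y = [0.0, 1.0, 0.0, 1.0, 0.0, 1.0]
avar = allan_variance(y)
wavar = weighted_allan_variance(y, [1.0] * len(y))
```

Multidimensional extensions (MAVAR, WMAVAR) replace the scalar squared difference with the squared norm of a vector difference, e.g. over (X, Y, Z) station coordinates.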

  17. Principal components and iterative regression analysis of geophysical series: Application to Sunspot number (1750-2004)

    Science.gov (United States)

    Nordemann, D. J. R.; Rigozo, N. R.; de Souza Echer, M. P.; Echer, E.

    2008-11-01

    We present an implementation of a least-squares iterative regression method applied to the sine functions embedded in the principal components extracted from geophysical time series. This method appears to be a useful improvement for quantitative periodicity analysis of non-stationary time series. The principal components determination followed by the least-squares iterative regression was implemented in an algorithm written in the Scilab (2006) language. The main result of the method is the set of sine functions embedded in the analyzed series, in decreasing order of significance: from the most important ones, likely to represent the physical processes involved in the generation of the series, to the least important ones that represent noise components. Taking into account the need for a deeper knowledge of the Sun's past history and its implications for global climate change, the method was applied to the Sunspot Number series (1750-2004). With the threshold and parameter values used here, the method yields a total of 441 explicit sine functions, among which 65 were considered significant and were used for a reconstruction with a normalized mean squared error of 0.146.
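One building block of such an iterative scheme is the least-squares fit of a sine component at a fixed trial frequency, which reduces to 2x2 normal equations; the full method would scan frequencies, keep the most significant component, subtract it, and iterate. A minimal sketch (the 11-year demo period is an illustrative stand-in for a solar-cycle component, not a result from the paper):

```python
import math

def fit_sine(t, y, omega):
    """Least-squares fit of y ≈ a*sin(ωt) + b*cos(ωt) at a fixed
    trial frequency ω, via the 2x2 normal equations. Returns
    (a, b, amplitude)."""
    saa = sum(math.sin(omega * ti) ** 2 for ti in t)
    sbb = sum(math.cos(omega * ti) ** 2 for ti in t)
    sab = sum(math.sin(omega * ti) * math.cos(omega * ti) for ti in t)
    ra = sum(yi * math.sin(omega * ti) for ti, yi in zip(t, y))
    rb = sum(yi * math.cos(omega * ti) for ti, yi in zip(t, y))
    det = saa * sbb - sab * sab
    a = (ra * sbb - rb * sab) / det
    b = (rb * saa - ra * sab) / det
    return a, b, math.hypot(a, b)

# Round trip: an ~11-year cycle sampled yearly for 100 years is
# recovered exactly from noise-free data.
omega = 2 * math.pi / 11.0
t = list(range(100))
y = [3.0 * math.sin(omega * ti) + 1.5 * math.cos(omega * ti) for ti in t]
a, b, amp = fit_sine(t, y, omega)
```

In the iterative regression the frequency itself is also adjusted by least squares, and fitting proceeds on the principal components rather than on the raw series, which concentrates coherent variability before the sine decomposition.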

  18. Interrupted time series analysis in drug utilization research is increasing: systematic review and recommendations.

    Science.gov (United States)

    Jandoc, Racquel; Burden, Andrea M; Mamdani, Muhammad; Lévesque, Linda E; Cadarette, Suzanne M

    2015-08-01

    To describe the use and reporting of interrupted time series methods in drug utilization research, we completed a systematic search of MEDLINE, Web of Science, and reference lists to identify English-language articles through December 2013 that used interrupted time series methods in drug utilization research. We tabulated the number of studies by publication year and summarized methodological detail. We identified 220 eligible empirical applications since 1984. Only 17 (8%) were published before 2000, and 90 (41%) were published since 2010. Segmented regression was the most commonly applied interrupted time series method (67%). Most studies assessed drug policy changes (51%, n = 112); 22% (n = 48) examined the impact of new evidence, 18% (n = 39) examined safety advisories, and 16% (n = 35) examined quality improvement interventions. Autocorrelation was considered in 66% of studies, 31% reported adjusting for seasonality, and 15% accounted for nonstationarity. Use of interrupted time series methods in drug utilization research has increased, particularly in recent years. Despite methodological recommendations, there is large variation in the reporting of analytic methods. Developing methodological and reporting standards for interrupted time series analysis is important to improve its application in drug utilization research, and we provide recommendations for consideration.
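Segmented regression, the most common ITS method found in the review, fits an intercept, a baseline slope, a level change, and a slope change at the intervention point. A minimal OLS sketch on noise-free demo data (this deliberately omits the autocorrelation and seasonality adjustments the review finds are often under-reported):

```python
def solve(A, b):
    """Gaussian elimination with partial pivoting (small dense systems)."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def segmented_regression(y, intervention):
    """OLS for y_t = b0 + b1*t + b2*step_t + b3*tsi_t, where step_t
    flags the post-intervention period (level change b2) and tsi_t
    counts time since the intervention (slope change b3)."""
    X = []
    for t in range(len(y)):
        step = 1.0 if t >= intervention else 0.0
        tsi = (t - intervention + 1.0) * step
        X.append([1.0, float(t), step, tsi])
    p = 4
    XtX = [[sum(row[r] * row[c] for row in X) for c in range(p)] for r in range(p)]
    Xty = [sum(X[i][r] * y[i] for i in range(len(y))) for r in range(p)]
    return solve(XtX, Xty)

# Noise-free demo: baseline level 2, slope 0.4; at t = 10 the level
# jumps by +5 and the slope changes by -0.3.
y = [2.0 + 0.4 * t + (5.0 - 0.3 * (t - 9) if t >= 10 else 0.0)
     for t in range(24)]
beta = segmented_regression(y, 10)
```

Real analyses would add Newey-West or GLS-style corrections for autocorrelation and seasonal terms; the point of the sketch is only the four-parameter design matrix that defines the method.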

  19. BiGGEsTS: integrated environment for biclustering analysis of time series gene expression data

    Directory of Open Access Journals (Sweden)

    Madeira Sara C

    2009-07-01

    Full Text Available Abstract Background The ability to monitor changes in expression patterns over time, and to observe the emergence of coherent temporal responses using expression time series, is critical to advance our understanding of complex biological processes. Biclustering has been recognized as an effective method for discovering local temporal expression patterns and unraveling potential regulatory mechanisms. The general biclustering problem is NP-hard. In the case of time series, this problem is tractable, and efficient algorithms can be used. However, there is still a need for specialized applications able to take advantage of the temporal properties inherent to expression time series, both from a computational and a biological perspective. Findings BiGGEsTS makes available state-of-the-art biclustering algorithms for analyzing expression time series. Gene Ontology (GO) annotations are used to assess the biological relevance of the biclusters. Methods for preprocessing expression time series and post-processing results are also included. The analysis is additionally supported by a visualization module capable of displaying informative representations of the data, including heatmaps, dendrograms, expression charts and graphs of enriched GO terms. Conclusion BiGGEsTS is a free open source graphical software tool for revealing local coexpression of genes in specific intervals of time, while integrating meaningful information on gene annotations. It is freely available at: http://kdbio.inesc-id.pt/software/biggests. We present a case study on the discovery of transcriptional regulatory modules in the response of Saccharomyces cerevisiae to heat stress.

  20. Definition of distance for nonlinear time series analysis of marked point process data

    Energy Technology Data Exchange (ETDEWEB)

    Iwayama, Koji, E-mail: koji@sat.t.u-tokyo.ac.jp [Research Institute for Food and Agriculture, Ryukoku University, 1-5 Yokotani, Seta Oe-cho, Otsu-Shi, Shiga 520-2194 (Japan); Hirata, Yoshito; Aihara, Kazuyuki [Institute of Industrial Science, The University of Tokyo, 4-6-1 Komaba, Meguro-ku, Tokyo 153-8505 (Japan)

    2017-01-30

    Marked point process data are time series of discrete events accompanied by some values, such as economic trades, earthquakes, and lightning strikes. A distance for marked point process data allows us to apply nonlinear time series analysis to such data. We propose a distance for marked point process data which can be calculated much faster than the existing distance when the number of marks is small. Furthermore, under some assumptions, the Kullback–Leibler divergences between posterior distributions for neighbors defined by this distance are small. We performed numerical simulations showing that analysis based on the proposed distance is effective. - Highlights: • A new distance for marked point process data is proposed. • The distance can be computed fast enough for a small number of marks. • The method to optimize parameter values of the distance is also proposed. • Numerical simulations indicate that the analysis based on the distance is effective.

  1. DATA ANALYSIS TECHNIQUES IN SERVICE QUALITY LITERATURE: ESSENTIALS AND ADVANCES

    Directory of Open Access Journals (Sweden)

    Mohammed Naved Khan

    2013-05-01

    Full Text Available Academic and business researchers have long debated the most appropriate data analysis techniques for empirical research in the domain of services marketing. On the basis of an exhaustive review of the literature, the present paper attempts to provide a concise and schematic portrayal of the data analysis techniques generally followed in the service quality literature. Collectively, the extant literature suggests a growing trend among researchers to rely on higher-order multivariate techniques, viz. confirmatory factor analysis, structural equation modeling, etc., to generate and analyze complex models, while at times ignoring very basic yet powerful procedures such as the mean, t-test, ANOVA and correlation. The marked shift in the orientation of researchers towards sophisticated analytical techniques can largely be attributed to competition within the community of researchers in the social sciences in general, and among those working in the area of service quality in particular, as well as the growing demands of journal reviewers. From a pragmatic viewpoint, it is expected that the paper will serve as a useful source of information and provide deeper insights to academic researchers, consultants, and practitioners interested in modelling patterns of service quality and arriving at optimal solutions to increasingly complex management problems.

  2. Practical applications of activation analysis and other nuclear techniques

    International Nuclear Information System (INIS)

    Lyon, W.S.

    1982-01-01

    Neutron activation analysis (NAA) is a versatile, sensitive, multielement, usually nondestructive analytical technique used to determine elemental concentrations in a variety of materials. Samples are irradiated with neutrons in a nuclear reactor and removed, and, for the nondestructive technique, the induced radioactivity is measured. This measurement of the γ rays emitted from specific radionuclides makes possible the quantitative determination of the elements present. The method is described, its advantages and disadvantages are listed, and a number of examples of its use are given. Two other nuclear methods, particle-induced x-ray emission and synchrotron-produced x-ray fluorescence, are also briefly discussed

  3. Maximum entropy technique in the doublet structure analysis

    International Nuclear Information System (INIS)

    Belashev, B.Z.; Panebrattsev, Yu.A.; Shakhaliev, Eh.I.; Soroko, L.M.

    1998-01-01

    The Maximum Entropy Technique (MENT) for the solution of inverse problems is explained. An effective computer program for solving the system of nonlinear equations encountered in MENT has been developed and tested. The capabilities of MENT have been demonstrated on the example of doublet structure analysis of noisy experimental data. A comparison of the MENT results with the results of the Fourier algorithm technique without regularization is presented. The tolerable noise level is 30% for MENT, but only 0.1% for the Fourier algorithm

  4. 17th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2016)

    International Nuclear Information System (INIS)

    2016-01-01

    Preface The 2016 edition of the International Workshop on Advanced Computing and Analysis Techniques in Physics Research took place on January 18-22, 2016, at the Universidad Técnica Federico Santa María (UTFSM) in Valparaíso, Chile. The present volume of IOP Conference Series is devoted to the selected scientific contributions presented at the workshop. In order to guarantee the scientific quality of the proceedings, all papers were thoroughly peer-reviewed by an ad-hoc Editorial Committee with the help of many careful reviewers. The ACAT workshop series has a long tradition starting in 1990 (Lyon, France), and takes place at intervals of a year and a half. Formerly these workshops were known under the name AIHENP (Artificial Intelligence for High Energy and Nuclear Physics). Each edition brings together experimental and theoretical physicists and computer scientists/experts from particle and nuclear physics, astronomy and astrophysics in order to exchange knowledge and experience in computing and data analysis in physics. Three tracks cover the main topics: computing technology (languages and system architectures), data analysis (algorithms and tools), and theoretical physics (techniques and methods). Although most contributions and discussions are related to particle physics and computing, other fields like condensed matter physics, earth physics and biophysics are often addressed, in the hope of sharing our approaches and visions. The workshop created a forum for exchanging ideas among fields, exploring and promoting cutting-edge computing technologies and debating hot topics. (paper)

  5. PREFACE: 15th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT2013)

    Science.gov (United States)

    Wang, Jianxiong

    2014-06-01

    This volume of Journal of Physics: Conference Series is dedicated to scientific contributions presented at the 15th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2013), which took place on 16-21 May 2013 at the Institute of High Energy Physics, Chinese Academy of Sciences, Beijing, China. The workshop series brings together computer science researchers and practitioners, and researchers from particle physics and related fields, to explore and confront the boundaries of computing, automatic data analysis, and theoretical calculation techniques. This year's edition of the workshop brought together over 120 participants from all over the world. 18 invited speakers presented key topics on the universe in computer, computing in earth sciences, multivariate data analysis, automated computation in quantum field theory, as well as computing and data analysis challenges in many fields. Over 70 other talks and posters presented state-of-the-art developments in the areas of the workshop's three tracks: Computing Technologies, Data Analysis Algorithms and Tools, and Computational Techniques in Theoretical Physics. The round table discussions on open-source software, knowledge sharing and scientific collaboration stimulated reflection on these issues in the respective areas. ACAT 2013 was generously sponsored by the Chinese Academy of Sciences (CAS), the National Natural Science Foundation of China (NSFC), Brookhaven National Laboratory in the USA (BNL), Peking University (PKU), the Theoretical Physics Center for Science Facilities of CAS (TPCSF-CAS) and Sugon. We would like to thank all the participants for their scientific contributions and for their enthusiastic participation in all the activities of the workshop. Further information on ACAT 2013 can be found at http://acat2013.ihep.ac.cn. Professor Jianxiong Wang Institute of High Energy Physics Chinese Academy of Science Details of committees and sponsors are available in the PDF

  6. Application of Time Series Analysis in Determination of Lag Time in Jahanbin Basin

    Directory of Open Access Journals (Sweden)

    Seied Yahya Mirzaee

    2005-11-01

        One of the important issues that plays a significant role in the hydrological study of a basin is the determination of lag time. The lag time associated with rainfall depends on several factors, such as permeability, vegetation cover, catchment slope, rainfall intensity, storm duration and type of rain. Lag time is an important parameter in many projects, such as dam design, and in water resource studies. The lag time of a basin can be calculated using various methods, one of which is time series analysis of spectral density. The analysis is based on Fourier series: the time series is approximated with sine and cosine functions, and harmonically significant quantities with individual frequencies are identified. The spectral density of multiple time series can be used to obtain the basin lag time for annual runoff and short-term rainfall fluctuations. A long lag time can be due to snowmelt, as well as to ice melting caused by rainfall on freezing days. In this research, the lag time of the Jahanbin basin has been determined using the spectral density method. The catchment receives both rainfall and snowfall. For short-term rainfall fluctuations with return periods of 2, 3 and 4 months, the lag times were found to be 0.18, 0.5 and 0.083 month, respectively.
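The spectral-density approach extracts the lag from the phase of the cross-spectrum between rainfall and runoff; its time-domain counterpart, picking the lag that maximizes the rainfall-runoff cross-correlation, makes the idea concrete. A sketch with synthetic data (not the basin's actual series):

```python
def best_lag(x, y, max_lag):
    """Lag (in samples) at which the cross-correlation of x (input,
    e.g. rainfall) with y (response, e.g. runoff) is maximal."""
    def xcorr(lag):
        # pair x[i] with the response lag steps later
        pairs = [(x[i], y[i + lag]) for i in range(len(x) - lag)]
        n = len(pairs)
        mx = sum(p[0] for p in pairs) / n
        my = sum(p[1] for p in pairs) / n
        num = sum((a - mx) * (b - my) for a, b in pairs)
        dx = sum((a - mx) ** 2 for a, _ in pairs) ** 0.5
        dy = sum((b - my) ** 2 for _, b in pairs) ** 0.5
        return num / (dx * dy) if dx > 0 and dy > 0 else 0.0
    return max(range(max_lag + 1), key=xcorr)

# Synthetic check: runoff is rainfall delayed by exactly 3 steps.
rain = [0, 4, 1, 0, 6, 2, 0, 0, 5, 1, 0, 3, 0, 0, 2, 0]
runoff = [0, 0, 0] + rain[:-3]
lag = best_lag(rain, runoff, 6)
```

The spectral formulation estimates the same delay as phase/frequency at each significant harmonic, which is what allows separate lag estimates for the annual runoff cycle and for short-term rainfall fluctuations.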

  7. Statistical tools for analysis and modeling of cosmic populations and astronomical time series: CUDAHM and TSE

    Science.gov (United States)

    Loredo, Thomas; Budavari, Tamas; Scargle, Jeffrey D.

    2018-01-01

    This presentation provides an overview of open-source software packages addressing two challenging classes of astrostatistics problems. (1) CUDAHM is a C++ framework for hierarchical Bayesian modeling of cosmic populations, leveraging graphics processing units (GPUs) to enable applying this computationally challenging paradigm to large datasets. CUDAHM is motivated by measurement error problems in astronomy, where density estimation and linear and nonlinear regression must be addressed for populations of thousands to millions of objects whose features are measured with possibly complex uncertainties, potentially including selection effects. An example calculation demonstrates accurate GPU-accelerated luminosity function estimation for simulated populations of 10^6 objects in about two hours using a single NVIDIA Tesla K40c GPU. (2) Time Series Explorer (TSE) is a collection of software in Python and MATLAB for exploratory analysis and statistical modeling of astronomical time series. It comprises a library of stand-alone functions and classes, as well as an application environment for interactive exploration of time series data. The presentation will summarize key capabilities of this emerging project, including new algorithms for analysis of irregularly-sampled time series.
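A standard tool for the irregular-sampling problem mentioned above is the Lomb-Scargle periodogram, which evaluates spectral power directly at the observed times without interpolating onto a regular grid. A small illustration with synthetic data (not part of TSE), using SciPy's implementation:

```python
import numpy as np
from scipy.signal import lombscargle

rng = np.random.default_rng(1)

# Irregularly sampled sinusoid: 200 observation times drawn at random.
t = np.sort(rng.uniform(0, 100, 200))
f0 = 0.25                                        # true frequency (cycles per unit time)
y = np.sin(2 * np.pi * f0 * t) + 0.2 * rng.standard_normal(200)

# The Lomb-Scargle periodogram works directly on the irregular times --
# no reconstruction (interpolation) step is needed.
ang_freqs = np.linspace(0.01, 2 * np.pi, 1000)   # angular frequencies (rad / unit time)
power = lombscargle(t, y - y.mean(), ang_freqs, normalize=True)
f_peak = ang_freqs[np.argmax(power)] / (2 * np.pi)
print(f"recovered frequency: {f_peak:.3f}")
```

Note that `scipy.signal.lombscargle` takes angular frequencies, hence the conversion back to cycles per unit time at the end.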

  8. Nuclear techniques of analysis in diamond synthesis and annealing

    Energy Technology Data Exchange (ETDEWEB)

    Jamieson, D. N.; Prawer, S.; Gonon, P.; Walker, R.; Dooley, S.; Bettiol, A.; Pearce, J. [Melbourne Univ., Parkville, VIC (Australia). School of Physics

    1996-12-31

    Nuclear techniques of analysis have played an important role in the study of synthetic and laser annealed diamond. These measurements have mainly used ion beam analysis with a focused MeV ion beam in a nuclear microprobe system. A variety of techniques have been employed. One of the most important is nuclear elastic scattering, sometimes called non-Rutherford scattering, which has been used to accurately characterise diamond films for thickness and composition. This is possible by the use of a database of measured scattering cross sections. Recently, this work has been extended and nuclear elastic scattering cross sections for both natural boron isotopes have been measured. For radiation damaged diamond, a focused laser annealing scheme has been developed which produces near complete regrowth of MeV phosphorus implanted diamonds. In the laser annealed regions, proton induced x-ray emission has been used to show that 50 % of the P atoms occupy lattice sites. This opens the way to produce n-type diamond for microelectronic device applications. All these analytical applications utilize a focused MeV microbeam which is ideally suited for diamond analysis. This presentation reviews these applications, as well as the technology of nuclear techniques of analysis for diamond with a focused beam. 9 refs., 6 figs.

  9. Nuclear techniques of analysis in diamond synthesis and annealing

    Energy Technology Data Exchange (ETDEWEB)

    Jamieson, D N; Prawer, S; Gonon, P; Walker, R; Dooley, S; Bettiol, A; Pearce, J [Melbourne Univ., Parkville, VIC (Australia). School of Physics

    1997-12-31

    Nuclear techniques of analysis have played an important role in the study of synthetic and laser annealed diamond. These measurements have mainly used ion beam analysis with a focused MeV ion beam in a nuclear microprobe system. A variety of techniques have been employed. One of the most important is nuclear elastic scattering, sometimes called non-Rutherford scattering, which has been used to accurately characterise diamond films for thickness and composition. This is possible by the use of a database of measured scattering cross sections. Recently, this work has been extended and nuclear elastic scattering cross sections for both natural boron isotopes have been measured. For radiation damaged diamond, a focused laser annealing scheme has been developed which produces near complete regrowth of MeV phosphorus implanted diamonds. In the laser annealed regions, proton induced x-ray emission has been used to show that 50 % of the P atoms occupy lattice sites. This opens the way to produce n-type diamond for microelectronic device applications. All these analytical applications utilize a focused MeV microbeam which is ideally suited for diamond analysis. This presentation reviews these applications, as well as the technology of nuclear techniques of analysis for diamond with a focused beam. 9 refs., 6 figs.

  10. Development of fault diagnostic technique using reactor noise analysis

    International Nuclear Information System (INIS)

    Park, Jin Ho; Kim, J. S.; Oh, I. S.; Ryu, J. S.; Joo, Y. S.; Choi, S.; Yoon, D. B.

    1999-04-01

    The ultimate goal of this project is to establish analysis techniques for diagnosing the integrity of reactor internals using reactor noise. Reactor noise analysis techniques for PWR and CANDU NPPs (nuclear power plants) were established, by which the dynamic characteristics of reactor internals and SPND instrumentation could be identified, and a noise database for each plant (both Korean and foreign) was constructed and compared. The changes in the dynamic characteristics of the Ulchin 1 and 2 reactor internals were also simulated under presumed fault conditions. Additionally, a portable reactor noise analysis system was developed so that real-time noise analysis can be performed directly at the plant site. The reactor noise analysis techniques developed, and the database obtained from the fault simulations, can be used to establish a knowledge-based expert system for diagnosing abnormal NPP conditions, and the portable reactor noise analysis system may be used as a substitute for a plant IVMS (internal vibration monitoring system). (author)
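The core of reactor noise analysis is identifying structural resonances from the power spectral density of detector signals; shifts in a resonance peak indicate changed dynamics of the internals. A minimal sketch with a simulated signal (the resonance frequency, Q factor, and filter model are illustrative stand-ins, not plant data):

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(2)

fs, f_res = 1000.0, 50.0          # sampling rate and simulated resonance (Hz)

# Simulate "reactor noise": white noise driving a lightly damped resonance,
# standing in for a vibrating internal structure seen by a neutron detector.
b, a = signal.iirpeak(f_res, Q=30.0, fs=fs)
noise = signal.lfilter(b, a, rng.standard_normal(60_000))

# Welch power spectral density: the resonance appears as a spectral peak
# whose frequency (and any drift of it over time) is the diagnostic signature.
f, Pxx = signal.welch(noise, fs=fs, nperseg=2048)
peak = f[np.argmax(Pxx)]
print(f"resonance identified at {peak:.1f} Hz")
```

A monitoring system would track such peaks across surveillance periods and flag deviations from the plant's baseline noise database.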

  11. Noble Gas Measurement and Analysis Technique for Monitoring Reprocessing Facilities

    International Nuclear Information System (INIS)

    William S. Charlton

    1999-01-01

    An environmental monitoring technique using analysis of stable noble gas isotopic ratios on-stack at a reprocessing facility was developed. This technique integrates existing technologies to strengthen safeguards at reprocessing facilities. The isotopic ratios are measured using a mass spectrometry system and are compared to a database of calculated isotopic ratios using a Bayesian data analysis method to determine specific fuel parameters (e.g., burnup, fuel type, fuel age, etc.). These inferred parameters can be used by investigators to verify operator declarations. A user-friendly software application (named NOVA) was developed for the application of this technique. NOVA included a Visual Basic user interface coupling a Bayesian data analysis procedure to a reactor physics database (calculated using the Monteburns 3.01 code system). The integrated system (mass spectrometry, reactor modeling, and data analysis) was validated using on-stack measurements during the reprocessing of target fuel from a U.S. production reactor and gas samples from the processing of EBR-II fast breeder reactor driver fuel. These measurements led to an inferred burnup that matched the declared burnup with sufficient accuracy and consistency for most safeguards applications. The NOVA code was also tested using numerous light water reactor measurements from the literature. NOVA was capable of accurately determining spent fuel type, burnup, and fuel age for these experimental results. Work should continue to demonstrate the robustness of this system for production, power, and research reactor fuels.
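The Bayesian comparison step can be illustrated on a grid: evaluate the likelihood of the measured isotopic ratio against a precomputed ratio-versus-burnup curve and normalize to get a posterior over burnup. The linear "database" model and all numbers below are hypothetical placeholders for NOVA's Monteburns-derived database, purely for illustration:

```python
import numpy as np

# Toy stand-in for the NOVA approach: compare a measured noble-gas isotopic
# ratio against a precomputed ratio-vs-burnup curve with Bayes' rule.
burnup_grid = np.linspace(0.0, 50.0, 501)           # GWd/tU grid (prior support)
model_ratio = 0.02 * burnup_grid + 0.1              # hypothetical calculated ratios

measured, sigma = 0.50, 0.02                        # measurement and its 1-sigma error

# Gaussian likelihood of the measurement at each candidate burnup,
# flat prior over the grid; normalize to obtain the posterior.
log_like = -0.5 * ((measured - model_ratio) / sigma) ** 2
posterior = np.exp(log_like - log_like.max())
posterior /= posterior.sum()

map_burnup = burnup_grid[np.argmax(posterior)]
print(f"inferred burnup: {map_burnup:.1f} GWd/tU")
```

In the real system the same machinery runs jointly over several ratios and several fuel parameters (burnup, type, age), with the forward model supplied by reactor physics calculations rather than a line.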

  12. Nuclear techniques of analysis in diamond synthesis and annealing

    International Nuclear Information System (INIS)

    Jamieson, D. N.; Prawer, S.; Gonon, P.; Walker, R.; Dooley, S.; Bettiol, A.; Pearce, J.

    1996-01-01

    Nuclear techniques of analysis have played an important role in the study of synthetic and laser annealed diamond. These measurements have mainly used ion beam analysis with a focused MeV ion beam in a nuclear microprobe system. A variety of techniques have been employed. One of the most important is nuclear elastic scattering, sometimes called non-Rutherford scattering, which has been used to accurately characterise diamond films for thickness and composition. This is possible by the use of a database of measured scattering cross sections. Recently, this work has been extended and nuclear elastic scattering cross sections for both natural boron isotopes have been measured. For radiation damaged diamond, a focused laser annealing scheme has been developed which produces near complete regrowth of MeV phosphorus implanted diamonds. In the laser annealed regions, proton induced x-ray emission has been used to show that 50 % of the P atoms occupy lattice sites. This opens the way to produce n-type diamond for microelectronic device applications. All these analytical applications utilize a focused MeV microbeam which is ideally suited for diamond analysis. This presentation reviews these applications, as well as the technology of nuclear techniques of analysis for diamond with a focused beam. 9 refs., 6 figs

  13. Development of fault diagnostic technique using reactor noise analysis

    Energy Technology Data Exchange (ETDEWEB)

    Park, Jin Ho; Kim, J. S.; Oh, I. S.; Ryu, J. S.; Joo, Y. S.; Choi, S.; Yoon, D. B

    1999-04-01

    The ultimate goal of this project is to establish analysis techniques for diagnosing the integrity of reactor internals using reactor noise. Reactor noise analysis techniques for PWR and CANDU NPPs (nuclear power plants) were established, by which the dynamic characteristics of reactor internals and SPND instrumentation could be identified, and a noise database for each plant (both Korean and foreign) was constructed and compared. The changes in the dynamic characteristics of the Ulchin 1 and 2 reactor internals were also simulated under presumed fault conditions. Additionally, a portable reactor noise analysis system was developed so that real-time noise analysis can be performed directly at the plant site. The reactor noise analysis techniques developed, and the database obtained from the fault simulations, can be used to establish a knowledge-based expert system for diagnosing abnormal NPP conditions, and the portable reactor noise analysis system may be used as a substitute for a plant IVMS (internal vibration monitoring system). (author)

  14. New trends in sample preparation techniques for environmental analysis.

    Science.gov (United States)

    Ribeiro, Cláudia; Ribeiro, Ana Rita; Maia, Alexandra S; Gonçalves, Virgínia M F; Tiritan, Maria Elizabeth

    2014-01-01

    Environmental samples include a wide variety of complex matrices, with low concentrations of analytes and presence of several interferences. Sample preparation is a critical step and the main source of uncertainties in the analysis of environmental samples, and it is usually laborious, high cost, time consuming, and polluting. In this context, there is increasing interest in developing faster, cost-effective, and environmentally friendly sample preparation techniques. Recently, new methods have been developed and optimized in order to miniaturize extraction steps, to reduce solvent consumption or become solventless, and to automate systems. This review attempts to present an overview of the fundamentals, procedure, and application of the most recently developed sample preparation techniques for the extraction, cleanup, and concentration of organic pollutants from environmental samples. These techniques include: solid phase microextraction, on-line solid phase extraction, microextraction by packed sorbent, dispersive liquid-liquid microextraction, and QuEChERS (Quick, Easy, Cheap, Effective, Rugged and Safe).

  15. Model order reduction techniques with applications in finite element analysis

    CERN Document Server

    Qu, Zu-Qing

    2004-01-01

    Despite the continued rapid advance in computing speed and memory the increase in the complexity of models used by engineers persists in outpacing them. Even where there is access to the latest hardware, simulations are often extremely computationally intensive and time-consuming when full-blown models are under consideration. The need to reduce the computational cost involved when dealing with high-order/many-degree-of-freedom models can be offset by adroit computation. In this light, model-reduction methods have become a major goal of simulation and modeling research. Model reduction can also ameliorate problems in the correlation of widely used finite-element analyses and test analysis models produced by excessive system complexity. Model Order Reduction Techniques explains and compares such methods focusing mainly on recent work in dynamic condensation techniques: - Compares the effectiveness of static, exact, dynamic, SEREP and iterative-dynamic condensation techniques in producing valid reduced-order mo...
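The simplest of the condensation techniques the book compares, static (Guyan) condensation, eliminates "slave" degrees of freedom from the stiffness matrix while retaining "master" DOFs: K_red = Kmm − Kms·Kss⁻¹·Ksm. A self-contained sketch on a small spring-chain model (the model itself is illustrative, not from the book):

```python
import numpy as np

# Static (Guyan) condensation of a spring-chain stiffness matrix.
def spring_chain(k, n):
    """Stiffness matrix of n equal springs (stiffness k) in series, fixed at one end."""
    K = np.zeros((n, n))
    for i in range(n):
        K[i, i] += 2 * k if i < n - 1 else k
        if i > 0:
            K[i, i - 1] = K[i - 1, i] = -k
    return K

K = spring_chain(1000.0, 6)
masters, slaves = [0, 5], [1, 2, 3, 4]

Kmm = K[np.ix_(masters, masters)]
Kms = K[np.ix_(masters, slaves)]
Kss = K[np.ix_(slaves, slaves)]
K_red = Kmm - Kms @ np.linalg.solve(Kss, Kms.T)   # condensed 2x2 stiffness

# For loads applied only at master DOFs, the reduced model reproduces the
# full model's master displacements exactly -- the defining property of
# static condensation (the approximation only enters for dynamics).
f = np.zeros(6); f[5] = 1.0
u_full = np.linalg.solve(K, f)
u_red = np.linalg.solve(K_red, np.array([0.0, 1.0]))
print(np.allclose(u_full[masters], u_red))   # True
```

Dynamic, SEREP and iterative condensation generalize this idea by building better transformations between the reduced and full DOF sets for vibrating systems.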

  16. Novel technique for coal pyrolysis and hydrogenation production analysis

    Energy Technology Data Exchange (ETDEWEB)

    Pfefferle, L.D.

    1990-01-01

    The overall objective of this study is to establish vacuum ultraviolet photoionization-MS and VUV pulsed EI-MS as useful tools for a simpler and more accurate direct mass spectrometric measurement of a broad range of hydrocarbon compounds in complex mixtures for ultimate application to the study of the kinetics of coal hydrogenation and pyrolysis processes. The VUV-MS technique allows ionization of a broad range of species with minimal fragmentation. Many compounds of interest can be detected with the 118 nm wavelength, but additional compound selectivity is achievable by tuning the wavelength of the photo-ionization source in the VUV. Resonant four wave mixing techniques in Hg vapor will allow near continuous tuning from about 126 to 106 nm. This technique would facilitate the scientific investigation of coal upgrading processes such as pyrolysis and hydrogenation by allowing accurate direct analysis of both stable and intermediate reaction products.

  17. Fractal time series analysis of postural stability in elderly and control subjects

    Directory of Open Access Journals (Sweden)

    Doussot Michel

    2007-05-01

    Background: The study of balance using stabilogram analysis is of particular interest in the study of falls. Although simple statistical parameters derived from the stabilogram have been shown to predict risk of falls, such measures offer little insight into the underlying control mechanisms responsible for degradation in balance. In contrast, fractal and non-linear time-series analysis of stabilograms, such as estimations of the Hurst exponent (H), may provide information related to the underlying motor control strategies governing postural stability. In order to be adapted for a home-based follow-up of balance, such methods need to be robust, regardless of the experimental protocol, while producing time series that are as short as possible. The present study compares two methods of calculating H, Detrended Fluctuation Analysis (DFA) and Stabilogram Diffusion Analysis (SDA), for elderly and control subjects, as well as evaluating the effect of recording duration. Methods: Centre of pressure signals were obtained from 90 young adult subjects and 10 elderly subjects. Data were sampled at 100 Hz for 30 s, including stepping onto and off the force plate. Estimations of H were made using sliding windows of 10, 5, and 2.5 s durations, with windows slid forward in 1-s increments. Multivariate analysis of variance was used to test for the effect of time, age and estimation method on the Hurst exponent, while the intra-class correlation coefficient (ICC) was used as a measure of reliability. Results: Both SDA and DFA methods were able to identify differences in postural stability between control and elderly subjects for time series as short as 5 s, with ICC values as high as 0.75 for DFA. Conclusion: Both methods would be well suited to non-invasive longitudinal assessment of balance. In addition, reliable estimations of H were obtained from time series as short as 5 s.
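DFA, one of the two estimators compared above, integrates the mean-removed signal, detrends it linearly within windows of varying size n, and reads the scaling exponent off the slope of log F(n) versus log n. A compact NumPy sketch, verified here on white noise (for which the exponent should be about 0.5); this is a generic textbook DFA, not the authors' exact implementation:

```python
import numpy as np

def dfa_alpha(x, window_sizes):
    """Detrended fluctuation analysis: slope of log F(n) vs log n."""
    profile = np.cumsum(x - np.mean(x))        # integrated, mean-removed series
    log_n, log_f = [], []
    for n in window_sizes:
        n_win = len(profile) // n
        segs = profile[: n_win * n].reshape(n_win, n)
        t = np.arange(n)
        fluct = []
        for seg in segs:
            coef = np.polyfit(t, seg, 1)       # linear detrend in each window
            fluct.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        log_n.append(np.log(n))
        log_f.append(0.5 * np.log(np.mean(fluct)))   # F(n) = sqrt(mean sq. residual)
    return np.polyfit(log_n, log_f, 1)[0]      # scaling exponent alpha

rng = np.random.default_rng(3)
white = rng.standard_normal(2 ** 14)
alpha = dfa_alpha(white, [16, 32, 64, 128, 256, 512])
print(f"alpha for white noise: {alpha:.2f}")
```

For stabilograms the same routine is applied to the centre-of-pressure series, and alpha above 0.5 indicates persistent (correlated) postural corrections.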

  18. Data Rods: High Speed, Time-Series Analysis of Massive Cryospheric Data Sets Using Object-Oriented Database Methods

    Science.gov (United States)

    Liang, Y.; Gallaher, D. W.; Grant, G.; Lv, Q.

    2011-12-01

    Change over time is the central driver of climate change detection. The goal is to diagnose the underlying causes, and make projections into the future. In an effort to optimize this process we have developed the Data Rod model, an object-oriented approach that provides the ability to query grid cell changes and their relationships to neighboring grid cells through time. The time series data are organized in time-centric structures called "data rods." A single data rod can be pictured as the multi-spectral data history at one grid cell: a vertical column of data through time. This resolves the long-standing problem of managing time-series data and opens new possibilities for temporal data analysis. This structure enables rapid time-centric analysis at any grid cell across multiple sensors and satellite platforms. Collections of data rods can be spatially and temporally filtered, statistically analyzed, and aggregated for use with pattern matching algorithms. Likewise, individual image pixels can be extracted to generate multi-spectral imagery at any spatial and temporal location. The Data Rods project has created a series of prototype databases to store and analyze massive datasets containing multi-modality remote sensing data. Using object-oriented technology, this method overcomes the operational limitations of traditional relational databases. To demonstrate the speed and efficiency of time-centric analysis using the Data Rods model, we have developed a sea ice detection algorithm. This application determines the concentration of sea ice in a small spatial region across a long temporal window. If performed using traditional analytical techniques, this task would typically require extensive data downloads and spatial filtering. Using Data Rods databases, the exact spatio-temporal data set is immediately available: no extraneous data is downloaded, and all selected data querying occurs transparently on the server side. Moreover, fundamental statistical
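The data-rod layout can be pictured with a toy in-memory store: each grid cell keeps its own time column, so a temporal query touches one rod instead of scanning whole images. This is a didactic sketch of the data structure only, not the project's object-oriented database:

```python
from collections import defaultdict

class DataRodStore:
    """Toy sketch of the "data rod" idea: per-grid-cell time columns,
    so temporal queries never scan whole images."""

    def __init__(self):
        self._rods = defaultdict(list)          # (row, col) -> [(time, value), ...]

    def ingest_image(self, time, image):
        """Append one gridded observation; each pixel extends its own rod."""
        for r, row in enumerate(image):
            for c, value in enumerate(row):
                self._rods[(r, c)].append((time, value))

    def rod(self, r, c, t0=None, t1=None):
        """Time series at one grid cell, optionally filtered to [t0, t1]."""
        series = self._rods[(r, c)]
        return [(t, v) for t, v in series
                if (t0 is None or t >= t0) and (t1 is None or t <= t1)]

store = DataRodStore()
store.ingest_image(2001, [[0.1, 0.2], [0.3, 0.4]])
store.ingest_image(2002, [[0.5, 0.6], [0.7, 0.8]])
print(store.rod(0, 1))          # [(2001, 0.2), (2002, 0.6)]
```

The inversion of the usual image-per-time layout is the whole trick: ingest cost is paid once, after which any cell's full history is a single contiguous lookup.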

  19. Compositional analysis of YBaCuO superconducting films with ion beam analysis techniques

    International Nuclear Information System (INIS)

    Jones, S.; Timmers, H.; Ophel, T.R.; Elliman, R.G.

    1999-01-01

    High-T_c YBa_xCu_yO_{7-δ} superconducting films are being developed for applications such as superconducting quantum interference devices. The carrier concentration, critical current density J_c and critical temperature T_c of these films depend sensitively on the oxygen content. Stoichiometry, uniformity with depth, homogeneity across the sample and film thickness are also important quantities for their characterisation. It has been shown, for example, that the stoichiometry of the metallic elements affects the growth characteristics and surface morphology of the films. With the deposition techniques used, reproducibility of film properties is difficult. The characterisation of YBa_xCu_yO_{7-δ} films with ion beam analysis techniques is complex. Whereas the three metallic elements can be detected with helium beams and Rutherford backscattering (RBS), the oxygen signal is generally obscured by that from substrate elements. It can be better detected using resonant backscattering with 3.04 MeV ^4He ions or nuclear reaction analysis. Elastic recoil detection (ERD) with highly energetic (1 MeV/amu) heavy beams (A > 120) enables all elements to be detected and separated in a single experiment. It is well established that ion bombardment induces vacancies in the oxygen sub-lattice, driving the material to change from crystalline to amorphous, the latter phase having a reduced oxygen content. In previous heavy ion ERD measurements of YBa_xCu_yO_z films with 200 MeV ^127I beams, the opaque films became transparent in the beam spot area, indicative of the amorphous phase. The accuracy of the oxygen measurement is therefore questionable. Indeed, using Raman spectroscopy, distortions of the crystalline structure above a fluence of 5 x 10^11 ions/cm^2, and at higher doses some signatures of a reduction in oxygen content, have been observed for such beams. It appears therefore that a correct determination of the oxygen content requires either a drastic reduction in fluence or a

  20. Small area analysis using micro-diffraction techniques

    International Nuclear Information System (INIS)

    Goehner, Raymond P.; Tissot, Ralph G. Jr.; Michael, Joseph R.

    2000-01-01

    An overall trend toward smaller electronic packages and devices makes it increasingly important and difficult to obtain meaningful diffraction information from small areas. X-ray micro-diffraction, electron back-scattered diffraction (EBSD) and Kossel are micro-diffraction techniques used for crystallographic analysis, including texture, phase identification and strain measurements. X-ray micro-diffraction is primarily used for phase analysis and residual strain measurements of areas between 10 μm and 100 μm. For areas this small, glass capillary optics are used to produce a usable collimated x-ray beam. These optics are designed to reflect x-rays below the critical angle, allowing a larger solid acceptance angle at the x-ray source and resulting in brighter, smaller x-ray beams. The determination of residual strain using micro-diffraction techniques is very important to the semiconductor industry. Residual stresses have caused voiding of the interconnect metal, which destroys electrical continuity. Being able to determine the residual stress helps industry predict failures from the aging effects of interconnects due to this stress voiding. Stress measurements would be impossible using a conventional x-ray diffractometer; however, utilizing a 30 μm glass capillary, these small areas are readily accessible for analysis. Kossel produces a wide angle diffraction pattern from fluorescent x-rays generated in the sample by an e-beam in an SEM. This technique can yield very precise lattice parameters for determining strain. Fig. 2 shows a Kossel pattern from a Ni specimen. Phase analysis on small areas is also possible using electron back-scattered diffraction (EBSD) and x-ray micro-diffraction techniques. EBSD has the advantage of allowing the user to observe the area of interest using the excellent imaging capabilities of the SEM. 
An EDS detector has been

  1. Modular techniques for dynamic fault-tree analysis

    Science.gov (United States)

    Patterson-Hine, F. A.; Dugan, Joanne B.

    1992-01-01

    It is noted that current approaches used to assess the dependability of complex systems such as Space Station Freedom and the Air Traffic Control System are incapable of handling the size and complexity of these highly integrated designs. A novel technique for modeling such systems which is built upon current techniques in Markov theory and combinatorial analysis is described. It enables the development of a hierarchical representation of system behavior which is more flexible than either technique alone. A solution strategy which is based on an object-oriented approach to model representation and evaluation is discussed. The technique is virtually transparent to the user since the fault tree models can be built graphically and the objects defined automatically. The tree modularization procedure allows the two model types, Markov and combinatoric, to coexist and does not require that the entire fault tree be translated to a Markov chain for evaluation. This effectively reduces the size of the Markov chain required and enables solutions with less truncation, making analysis of longer mission times possible. Using the fault-tolerant parallel processor as an example, a model is built and solved for a specific mission scenario and the solution approach is illustrated in detail.
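The combinatorial half of the modular approach is easy to sketch: for independent basic events, an OR gate combines child probabilities as 1 − ∏(1 − p) and an AND gate as ∏p, evaluated bottom-up. The modular method in the paper additionally dispatches dynamic subtrees to a Markov-chain solver; this minimal sketch covers only the combinatorial part:

```python
import math

def evaluate(node, probs):
    """Bottom-up evaluation of a static fault tree with independent basic events."""
    if isinstance(node, str):                       # basic event leaf
        return probs[node]
    gate, children = node
    child_p = [evaluate(c, probs) for c in children]
    if gate == "AND":
        return math.prod(child_p)                   # all children must fail
    if gate == "OR":
        return 1.0 - math.prod(1.0 - p for p in child_p)   # any child fails
    raise ValueError(f"unknown gate {gate!r}")

# Top event: (A AND B) OR C, with independent failure probabilities.
tree = ("OR", [("AND", ["A", "B"]), "C"])
p = evaluate(tree, {"A": 0.01, "B": 0.02, "C": 0.001})
print(f"top-event probability: {p:.7f}")
```

In the modularized scheme, a subtree containing sequence-dependent (dynamic) gates would be replaced by a leaf whose probability comes from solving the corresponding Markov chain, keeping that chain small.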

  2. Regional Land Subsidence Analysis in Eastern Beijing Plain by InSAR Time Series and Wavelet Transforms

    Directory of Open Access Journals (Sweden)

    Mingliang Gao

    2018-02-01

    Land subsidence is a geo-environmental hazard in which the ground surface is lowered regionally by natural or man-made factors. Beijing, the capital city of China, has suffered from land subsidence since the 1950s, and extreme groundwater extraction has led to subsidence rates of more than 100 mm/year. In this study, we employ two SAR datasets acquired by the Envisat and TerraSAR-X satellites to investigate surface deformation in the Beijing Plain from 2003 to 2013 based on the multi-temporal InSAR technique. We also use observation wells providing in situ hydraulic head levels to track the evolution of land subsidence and the spatial-temporal changes of the groundwater level. We then analyze the accumulated displacement and hydraulic head level time series using the continuous wavelet transform to separate periodic signal components. Finally, the cross wavelet transform (XWT) and wavelet transform coherence (WTC) are applied to analyze the relationship between the accumulated displacement and hydraulic head level time series. The results show that the subsidence centers in the northern Beijing Plain are spatially consistent with the groundwater drop funnels. According to the analysis of well-based results located in different areas, long-term groundwater exploitation in the northern subsidence area has led to a continuous decline of the water level, resulting in inelastic and permanent compaction, while for the monitoring wells located outside the subsidence area, the subsidence time series show clearly elastic (seasonal) deformation characteristics as the groundwater level changes. Moreover, according to the wavelet transforms, the land subsidence time series at the monitoring well sites lag several months behind the groundwater level change.
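The cross wavelet transform used above is the product of one series' continuous wavelet transform with the complex conjugate of the other's; its magnitude highlights periods where the two series share power, and its phase encodes the lead/lag. A minimal Morlet-based sketch on synthetic monthly series sharing an annual cycle (stand-ins for the displacement and groundwater records, with textbook Torrence-and-Compo scale conventions):

```python
import numpy as np

def morlet_cwt(x, dt, periods, w0=6.0):
    """Continuous wavelet transform with a complex Morlet wavelet."""
    out = np.empty((len(periods), len(x)), dtype=complex)
    # Torrence & Compo: scale s maps to Fourier period T via
    # T = s * 4*pi / (w0 + sqrt(2 + w0**2))
    fourier_factor = 4 * np.pi / (w0 + np.sqrt(2 + w0 ** 2))
    for i, T in enumerate(periods):
        s = T / fourier_factor
        tau = np.arange(-4 * s, 4 * s + dt, dt) / s
        psi = np.pi ** -0.25 * np.exp(1j * w0 * tau) * np.exp(-tau ** 2 / 2)
        psi *= np.sqrt(dt / s)                       # energy normalization
        out[i] = np.convolve(x, psi, mode="same")
    return out

# Two monthly series sharing a 12-month (annual) cycle plus noise --
# stand-ins for displacement and groundwater-level records.
rng = np.random.default_rng(4)
t = np.arange(240)                                   # 20 years, monthly
x = np.sin(2 * np.pi * t / 12) + 0.3 * rng.standard_normal(t.size)
y = np.sin(2 * np.pi * t / 12 - 0.5) + 0.3 * rng.standard_normal(t.size)

periods = np.arange(4.0, 25.0)                       # candidate periods (months)
Wx, Wy = morlet_cwt(x, 1.0, periods), morlet_cwt(y, 1.0, periods)
xwt = Wx * np.conj(Wy)                               # cross wavelet transform
common = np.abs(xwt[:, 48:-48]).mean(axis=1)         # drop edges (cone of influence)
best_period = periods[np.argmax(common)]
print(f"strongest common power at period {best_period:.0f} months")
```

The phase angle of `xwt` at that period, divided by 2π/12, would give the several-month lag reported in the study; production analyses typically use a dedicated package rather than this bare-bones transform.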

  3. A review of residual stress analysis using thermoelastic techniques

    Energy Technology Data Exchange (ETDEWEB)

    Robinson, A F; Dulieu-Barton, J M; Quinn, S [University of Southampton, School of Engineering Sciences, Highfield, Southampton, SO17 1BJ (United Kingdom); Burguete, R L [Airbus UK Ltd., New Filton House, Filton, Bristol, BS99 7AR (United Kingdom)

    2009-08-01

    Thermoelastic Stress Analysis (TSA) is a full-field technique for experimental stress analysis based on infra-red thermography. The technique has proved extremely effective for studying elastic stress fields and is now well established. It is based on measuring the temperature change that occurs as a result of a stress change. Because residual stress is essentially a mean stress, it is accepted that the linear form of the TSA relationship cannot be used to evaluate residual stresses. However, in situations where this linear relationship is not valid, or where manufacturing procedures have caused departures in material properties, evaluations of residual stress have been possible. The purpose of this paper is to review the current status of TSA-based approaches for the evaluation of residual stresses and to provide examples of where promising results have been obtained.
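The linear TSA relationship referred to above is ΔT = −K·T₀·Δσ, with thermoelastic constant K = α/(ρ·Cp); it depends only on the stress change, which is why a static residual (mean) stress produces no signal in the linear regime. A quick order-of-magnitude check with representative textbook values for steel (illustrative numbers, not from the paper):

```python
# Linear TSA relation: dT = -K * T0 * d_sigma, with K = alpha / (rho * cp).
# Representative steel properties (textbook values, for illustration only).
alpha = 1.2e-5        # thermal expansion coefficient (1/K)
rho = 7800.0          # density (kg/m^3)
cp = 460.0            # specific heat (J/(kg K))
T0 = 293.0            # absolute temperature (K)
d_sigma = 100e6       # change in the first stress invariant (Pa)

K = alpha / (rho * cp)                 # thermoelastic constant (1/Pa)
dT = -K * T0 * d_sigma                 # adiabatic temperature change
print(f"K = {K:.2e} 1/Pa, dT = {dT * 1000:.0f} mK")
```

A 100 MPa stress swing thus produces only about a tenth of a kelvin of cooling, which is why TSA needs sensitive infra-red detectors and cyclic loading.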

  4. A review of residual stress analysis using thermoelastic techniques

    International Nuclear Information System (INIS)

    Robinson, A F; Dulieu-Barton, J M; Quinn, S; Burguete, R L

    2009-01-01

    Thermoelastic Stress Analysis (TSA) is a full-field technique for experimental stress analysis based on infra-red thermography. The technique has proved extremely effective for studying elastic stress fields and is now well established. It is based on measuring the temperature change that occurs as a result of a stress change. Because residual stress is essentially a mean stress, it is accepted that the linear form of the TSA relationship cannot be used to evaluate residual stresses. However, in situations where this linear relationship is not valid, or where manufacturing procedures have caused departures in material properties, evaluations of residual stress have been possible. The purpose of this paper is to review the current status of TSA-based approaches for the evaluation of residual stresses and to provide examples of where promising results have been obtained.

  5. Technique Triangulation for Validation in Directed Content Analysis

    Directory of Open Access Journals (Sweden)

    Áine M. Humble PhD

    2009-09-01

    Division of labor in wedding planning varies for first-time marriages, with three types of couples identified (traditional, transitional, and egalitarian), but nothing is known about wedding planning for remarrying individuals. Using semistructured interviews, the author interviewed 14 couples in which at least one person had remarried, and used directed content analysis to investigate the extent to which the aforementioned typology could be transferred to this different context. In this paper she describes how a triangulation of analytic techniques provided validation for couple classifications and also helped with moving beyond "blind spots" in data analysis. The analytic approaches were the constant comparative technique, rank order comparison, and visual representation of coding using MAXQDA 2007's TextPortraits tool.

  6. A BWR 24-month cycle analysis using multicycle techniques

    International Nuclear Information System (INIS)

    Hartley, K.D.

    1993-01-01

    Boiling water reactor (BWR) fuel cycle design analyses have become increasingly challenging in the past several years. As utilities continue to seek improved capacity factors, reduced power generation costs, and reduced outage costs, longer cycle lengths and fuel design optimization become important considerations. Accurate multicycle analysis techniques are necessary to determine the viability of fuel designs and cycle operating strategies to meet reactor operating requirements (e.g., thermal and reactivity margin constraints) while minimizing overall fuel cycle costs. Siemens Power Corporation (SPC), Nuclear Division, has successfully employed multicycle analysis techniques with realistic rodded cycle depletions to demonstrate equilibrium fuel cycle performance in 24-month cycles. Analyses have been performed for a BWR/5 reactor, at both rated and uprated power conditions

  7. Runoff Potentiality of a Watershed through SCS and Functional Data Analysis Technique

    Science.gov (United States)

    Adham, M. I.; Shirazi, S. M.; Othman, F.; Rahman, S.; Yusop, Z.; Ismail, Z.

    2014-01-01

    Runoff potentiality of a watershed was assessed by identifying the curve number (CN) and applying soil conservation service (SCS) and functional data analysis (FDA) techniques. Daily discrete rainfall data were collected from weather stations in the study area and smoothed with the lowess method. As runoff data show a periodic pattern in each watershed, Fourier series were fitted to the smoothed curves of the eight watersheds: seven Fourier terms gave the best fit for watersheds 5 and 8, and eight terms for the rest. Bootstrapped smooth-curve analysis reveals that watersheds 1, 2, 3, 6, 7, and 8 have monthly mean runoffs of 29, 24, 22, 23, 26, and 27 mm, respectively, and these watersheds would likely contribute to surface runoff in the study area. The purpose of this study was to transform runoff data into a smooth curve representing the surface runoff pattern and mean runoff of each watershed through statistical methods. This study provides information on the runoff potentiality of each watershed and also provides input data for hydrological modeling. PMID:25152911
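The truncated Fourier fit described above is a linear least-squares problem: build a design matrix of 1, cos(kωt), sin(kωt) columns and solve for the coefficients. A sketch with synthetic monthly runoff (the runoff numbers and noise level are invented, not the study's data):

```python
import numpy as np

# Least-squares fit of a truncated Fourier series (cf. the paper's 7-8 term
# fits) to a monthly runoff curve with annual period.
def fourier_fit(t, y, period, n_terms):
    """Fitted values of y ~ a0 + sum_k [ak*cos(k*w*t) + bk*sin(k*w*t)]."""
    w = 2 * np.pi / period
    cols = [np.ones_like(t)]
    for k in range(1, n_terms + 1):
        cols += [np.cos(k * w * t), np.sin(k * w * t)]
    A = np.column_stack(cols)
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return A @ coef

rng = np.random.default_rng(5)
t = np.arange(120.0)                                 # 10 years of monthly data
runoff = (25 + 8 * np.sin(2 * np.pi * t / 12)        # annual cycle
          + 3 * np.cos(4 * np.pi * t / 12)           # semi-annual harmonic
          + rng.normal(0, 1.0, t.size))              # noise

fit = fourier_fit(t, runoff, period=12.0, n_terms=7)
rmse = np.sqrt(np.mean((fit - runoff) ** 2))
print(f"RMSE of 7-term Fourier fit: {rmse:.2f} mm")
```

The number of retained terms trades smoothness against fidelity, which is why the study settled on seven terms for two watersheds and eight for the others.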

  8. Runoff Potentiality of a Watershed through SCS and Functional Data Analysis Technique

    Directory of Open Access Journals (Sweden)

    M. I. Adham

    2014-01-01

    Full Text Available Runoff potentiality of a watershed was assessed using the Soil Conservation Service (SCS) curve number (CN) and functional data analysis (FDA) techniques. Daily discrete rainfall data were collected from weather stations in the study area and analyzed with the lowess method for curve smoothing. As runoff data exhibit a periodic pattern in each watershed, a Fourier series was fitted to the smoothed curve of each of the eight watersheds. Seven Fourier terms were used for watersheds 5 and 8, while eight terms were used for the remaining watersheds to obtain the best fit. Bootstrapped smooth-curve analysis reveals that watersheds 1, 2, 3, 6, 7, and 8 have monthly mean runoffs of 29, 24, 22, 23, 26, and 27 mm, respectively, and these watersheds would likely contribute to surface runoff in the study area. The purpose of this study was to transform runoff data into a smooth curve representing the surface runoff pattern and mean runoff of each watershed through statistical methods. This study provides information on the runoff potentiality of each watershed and also provides input data for hydrological modeling.

  9. Ion beam analysis techniques applied to large scale pollution studies

    Energy Technology Data Exchange (ETDEWEB)

    Cohen, D D; Bailey, G; Martin, J; Garton, D; Noorman, H; Stelcer, E; Johnson, P [Australian Nuclear Science and Technology Organisation, Lucas Heights, NSW (Australia)

    1994-12-31

    Ion Beam Analysis (IBA) techniques are ideally suited to analysing the thousands of filter papers a year that may originate from a large scale aerosol sampling network. They are fast, multi-elemental and, for the most part, non-destructive, so other analytical methods such as neutron activation and ion chromatography can be performed afterwards. ANSTO, in collaboration with the NSW EPA, Pacific Power and the Universities of NSW and Macquarie, has established a large area fine aerosol sampling network covering nearly 80,000 square kilometres of NSW with 25 fine particle samplers. This network, known as ASP, was funded by the Energy Research and Development Corporation (ERDC) and commenced sampling on 1 July 1991. The cyclone sampler at each site has a 2.5 µm particle diameter cut-off and runs for 24 hours every Sunday and Wednesday, using one Gillman 25 mm diameter stretched Teflon filter for each day. These filters are ideal targets for ion beam analysis work. Currently ANSTO receives 300 filters per month from this network for analysis using its accelerator based ion beam techniques on the 3 MV Van de Graaff accelerator. One week a month of accelerator time is dedicated to this analysis. Four simultaneous accelerator based IBA techniques are used at ANSTO to analyse for the following 24 elements: H, C, N, O, F, Na, Al, Si, P, S, Cl, K, Ca, Ti, V, Cr, Mn, Fe, Cu, Ni, Co, Zn, Br and Pb. The IBA techniques proved invaluable in identifying sources of fine particles and their spatial and seasonal variations across the large area sampled by the ASP network. 3 figs.

  10. Ion beam analysis techniques applied to large scale pollution studies

    Energy Technology Data Exchange (ETDEWEB)

    Cohen, D.D.; Bailey, G.; Martin, J.; Garton, D.; Noorman, H.; Stelcer, E.; Johnson, P. [Australian Nuclear Science and Technology Organisation, Lucas Heights, NSW (Australia)

    1993-12-31

    Ion Beam Analysis (IBA) techniques are ideally suited to analysing the thousands of filter papers a year that may originate from a large scale aerosol sampling network. They are fast, multi-elemental and, for the most part, non-destructive, so other analytical methods such as neutron activation and ion chromatography can be performed afterwards. ANSTO, in collaboration with the NSW EPA, Pacific Power and the Universities of NSW and Macquarie, has established a large area fine aerosol sampling network covering nearly 80,000 square kilometres of NSW with 25 fine particle samplers. This network, known as ASP, was funded by the Energy Research and Development Corporation (ERDC) and commenced sampling on 1 July 1991. The cyclone sampler at each site has a 2.5 µm particle diameter cut-off and runs for 24 hours every Sunday and Wednesday, using one Gillman 25 mm diameter stretched Teflon filter for each day. These filters are ideal targets for ion beam analysis work. Currently ANSTO receives 300 filters per month from this network for analysis using its accelerator based ion beam techniques on the 3 MV Van de Graaff accelerator. One week a month of accelerator time is dedicated to this analysis. Four simultaneous accelerator based IBA techniques are used at ANSTO to analyse for the following 24 elements: H, C, N, O, F, Na, Al, Si, P, S, Cl, K, Ca, Ti, V, Cr, Mn, Fe, Cu, Ni, Co, Zn, Br and Pb. The IBA techniques proved invaluable in identifying sources of fine particles and their spatial and seasonal variations across the large area sampled by the ASP network. 3 figs.

  11. Analysis of Cell Phone Usage Using Correlation Techniques

    OpenAIRE

    T S R MURTHY; D. SIVA RAMA KRISHNA

    2011-01-01

    The present paper is a sample survey analysis examined using correlation techniques. The usage of mobile phones is almost unavoidable these days, and as such the authors have made a systematic survey, through a well-prepared questionnaire, of the extent to which mobile phones are used. The samples span various economic groups across a population of over one lakh people. The results are scientifically categorized and interpreted to match the ground reality.
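At its simplest, a correlation analysis of survey aggregates computes a Pearson coefficient between group-level variables. The figures below are hypothetical, chosen only to illustrate the calculation, not data from this survey:

```python
import numpy as np

# Hypothetical survey aggregates: monthly income (thousand rupees) vs.
# monthly mobile usage (hours) for ten respondent groups -- illustrative only.
income = np.array([5, 8, 10, 12, 15, 18, 22, 25, 30, 40], dtype=float)
usage = np.array([20, 24, 25, 30, 34, 33, 40, 42, 48, 55], dtype=float)

# Pearson correlation coefficient between the two group-level variables
r = np.corrcoef(income, usage)[0, 1]
print(f"Pearson r = {r:.3f}")
```

A coefficient near +1 would suggest usage rises with income across the surveyed groups; real survey work would also report significance and control for group size.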

  12. An evaluation of directional analysis techniques for multidirectional, partially reflected waves .1. numerical investigations

    DEFF Research Database (Denmark)

    Ilic, C; Chadwick, A; Helm-Petersen, Jacob

    2000-01-01

    Recent studies of advanced directional analysis techniques have mainly centred on incident wave fields. In the study of coastal structures, however, partially reflective wave fields are commonly present. In the near-structure field, phase-locked methods can be successfully applied; in the far field, non-phase-locked methods are more appropriate. In this paper, the accuracy of two non-phase-locked methods of directional analysis, the maximum likelihood method (MLM) and the Bayesian directional method (BDM), has been quantitatively evaluated using numerical simulations for the case of multidirectional waves with partial reflections. It is shown that the results are influenced by the ratio of the distance from the reflector (L) to the length of the time series (S) used in the spectral analysis. Both methods are found to be capable of determining the incident and reflective wave fields when L/S > 0…

  13. A case study of the sensitivity of forecast skill to data and data analysis techniques

    Science.gov (United States)

    Baker, W. E.; Atlas, R.; Halem, M.; Susskind, J.

    1983-01-01

    A series of experiments has been conducted to examine the sensitivity of forecast skill to various data and data analysis techniques for the 0000 GMT case of January 21, 1979. These include the individual components of the FGGE observing system, the temperatures obtained with different satellite retrieval methods, and the method of vertical interpolation between the mandatory pressure analysis levels and the model sigma levels. It is found that NESS TIROS-N infrared retrievals seriously degrade a rawinsonde-only analysis over land, resulting in a poorer forecast over North America. Less degradation in the 72-hr forecast skill at sea level, and some improvement at 500 mb, is noted relative to the control when the TIROS-N retrievals are produced with a physical inversion method that utilizes a 6-hr forecast first guess. NESS VTPR oceanic retrievals lead to an improved forecast over North America when added to the control.

  14. Identifying secondary series for stepwise common singular spectrum ...

    African Journals Online (AJOL)

    Abstract. Common singular spectrum analysis is a technique which can be used to forecast a primary time series by using the information from a secondary series. Not all secondary series, however, provide useful information. A first contribution in this paper is to point out the properties which a secondary series should ...
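Common singular spectrum analysis builds on basic SSA, which embeds a series in a trajectory matrix and decomposes it via the SVD. A minimal single-series SSA sketch in Python/NumPy follows; the window length and test series are arbitrary choices for illustration, not from the paper:

```python
import numpy as np

def ssa_decompose(x, L):
    """Basic singular spectrum analysis: embed the series in a trajectory
    (Hankel) matrix, take the SVD, and return the elementary reconstructed
    components obtained by diagonal averaging."""
    N = len(x)
    K = N - L + 1
    # Trajectory matrix: column j is the window x[j:j+L]
    X = np.column_stack([x[j:j + L] for j in range(K)])
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    comps = []
    for i in range(len(s)):
        Xi = s[i] * np.outer(U[:, i], Vt[i])
        # Anti-diagonal averaging maps the rank-1 matrix back to a length-N series
        comp = np.array([np.mean(Xi[::-1].diagonal(k)) for k in range(-L + 1, K)])
        comps.append(comp)
    return np.array(comps)

# Noisy sine: the leading pair of components should capture the oscillation
t = np.arange(200)
x = np.sin(2 * np.pi * t / 20) + 0.1 * np.random.default_rng(1).normal(size=t.size)
comps = ssa_decompose(x, L=40)
oscillation = comps[:2].sum(axis=0)       # leading eigentriple pair
```

Summing all components reproduces the original series exactly; forecasting and the "common" multivariate variant then operate on a selected subset of components.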

  15. Flood Frequency Analysis For Partial Duration Series In Ganjiang River Basin

    Science.gov (United States)

    zhangli, Sun; xiufang, Zhu; yaozhong, Pan

    2016-04-01

    Accurate estimation of flood frequency is key to effective, nationwide flood damage abatement programs. The partial duration series (PDS) method is widely used in hydrologic studies because it considers all events above a certain threshold level, as compared to the annual maximum series (AMS) method, which considers only the annual maximum value. However, the PDS has a drawback in that it is difficult to define the thresholds and maintain an independent and identical distribution of the partial duration time series; this drawback is discussed in this paper. The Ganjiang River is the seventh largest tributary of the Yangtze River, the longest river in China. The Ganjiang River basin covers a drainage area of 81,258 km² at the Wanzhou hydrologic station, the basin outlet. In this work, 56 years of daily flow data (1954-2009) from the Wanzhou station were used to analyze flood frequency, and the Pearson type III model was employed as the hydrologic probability distribution. Three tasks were accomplished: (1) the threshold of the PDS was obtained by percentile rank of daily runoff; (2) trend analysis of the flow series was conducted using the PDS; and (3) flood frequency analysis was conducted for the partial duration flow series. The results showed a slight upward trend of the annual runoff in the Ganjiang River basin. The maximum flow with a 0.01 exceedance probability (corresponding to a 100-year flood peak under stationary conditions) was 20,000 m³/s, while that with a 0.1 exceedance probability was 15,000 m³/s. These results will serve as a guide to hydrological engineering planning, design, and management for the policymakers and decision makers concerned.
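The percentile-rank threshold and the independence requirement of the PDS method can be illustrated with a simple peaks-over-threshold selection. The declustering rule (a minimum gap between events) and the synthetic flows below are illustrative assumptions, not the paper's procedure:

```python
import numpy as np

def pds_peaks(flow, threshold, min_gap=7):
    """Select independent peaks over threshold from a daily flow series.
    Exceedances separated by fewer than `min_gap` days are merged into one
    event, keeping the largest value (a simple independence criterion)."""
    idx = np.flatnonzero(flow > threshold)
    if idx.size == 0:
        return np.array([])
    peaks = []
    start = prev = idx[0]
    for i in idx[1:]:
        if i - prev > min_gap:                   # gap long enough: close the event
            peaks.append(flow[start:prev + 1].max())
            start = i
        prev = i
    peaks.append(flow[start:prev + 1].max())     # close the final event
    return np.array(peaks)

# Synthetic 10-year daily flow series (m3/s)
rng = np.random.default_rng(2)
flow = rng.gamma(shape=2.0, scale=800.0, size=3650)
threshold = np.percentile(flow, 95)              # percentile-rank threshold
peaks = pds_peaks(flow, threshold)
print(f"threshold = {threshold:.0f} m3/s, {peaks.size} events selected")
```

The selected peaks would then be fitted with a probability model (the paper uses Pearson type III) to estimate quantiles such as the 100-year flood.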

  16. Analysis of diagnostic calorimeter data by the transfer function technique

    Energy Technology Data Exchange (ETDEWEB)

    Delogu, R. S., E-mail: rita.delogu@igi.cnr.it; Pimazzoni, A.; Serianni, G. [Consorzio RFX, Corso Stati Uniti, 35127 Padova (Italy); Poggi, C.; Rossi, G. [Università degli Studi di Padova, Via 8 Febbraio 1848, 35122 Padova (Italy)

    2016-02-15

    This paper describes the analysis procedure applied to thermal measurements on the rear side of a carbon fibre composite calorimeter, with the purpose of reconstructing the energy flux due to an ion beam striking the front side. The method is based on the transfer function technique and allows fast analysis by means of the fast Fourier transform algorithm. Its efficacy has been tested on both simulated and measured temperature profiles: in all cases, the energy flux features are well reproduced and beamlets are well resolved. Limits and restrictions of the method are also discussed, along with strategies to handle issues related to signal noise and digital processing.
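In essence, transfer-function reconstruction is deconvolution in the frequency domain. Below is a hedged Python/NumPy sketch with an assumed exponential impulse response and a toy input pulse; the Tikhonov-style regularisation term is an illustrative choice, not necessarily what the paper uses:

```python
import numpy as np

def deconvolve_fft(measured, impulse_response, eps=1e-6):
    """Recover an input flux from a measured signal by dividing spectra
    (transfer-function method) with a small regularisation term eps."""
    n = len(measured)
    Y = np.fft.rfft(measured)
    H = np.fft.rfft(impulse_response, n)
    # Regularised inverse filter: conj(H) * Y / (|H|^2 + eps)
    X = np.conj(H) * Y / (np.abs(H) ** 2 + eps)
    return np.fft.irfft(X, n)

# Toy example: an assumed exponential "thermal" response blurs a square pulse
n = 256
t = np.arange(n)
x_true = np.where((t > 50) & (t < 90), 1.0, 0.0)   # input flux (arbitrary units)
h = np.exp(-t / 15.0)
h /= h.sum()                                        # normalised impulse response
y = np.convolve(x_true, h)[:n]                      # "measured" rear-side signal

x_est = deconvolve_fft(y, h)
print("max reconstruction error:", np.abs(x_est - x_true).max())
```

The eps term keeps the division stable where the transfer function is small, which is where measurement noise would otherwise be amplified, the issue the paper flags for real signals.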

  17. FDTD technique based crosstalk analysis of bundled SWCNT interconnects

    International Nuclear Information System (INIS)

    Duksh, Yograj Singh; Kaushik, Brajesh Kumar; Agarwal, Rajendra P.

    2015-01-01

    The equivalent electrical circuit model of bundled single-walled carbon nanotube based distributed RLC interconnects is employed for the crosstalk analysis. Accurate time domain analysis of crosstalk effects in VLSI interconnects has emerged as an essential design criterion. This paper presents a brief description of the numerical finite difference time domain (FDTD) technique, which is intended for the estimation of voltages and currents on coupled transmission lines. For the FDTD implementation, the stability of the proposed model is strictly restricted by the Courant condition. This method is used for the estimation of crosstalk-induced propagation delay and peak voltage in lossy RLC interconnects. Both functional and dynamic crosstalk effects are analyzed in the coupled transmission line. The effect of line resistance on crosstalk-induced delay and peak voltage under dynamic and functional crosstalk is also evaluated. The FDTD analysis and the SPICE simulations are carried out at the 32 nm technology node for global interconnects. It is observed that the analytical results obtained using the FDTD technique are in good agreement with the SPICE simulation results. The crosstalk-induced delay, propagation delay, and peak voltage obtained using the FDTD technique show average errors of 4.9%, 3.4% and 0.46%, respectively, in comparison to SPICE. (paper)
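A one-dimensional FDTD update for a single lossy transmission line shows where the Courant condition enters. The per-unit-length parameters below are illustrative values, not the paper's 32 nm interconnect extraction, and the open-end boundary is deliberately crude:

```python
import numpy as np

# Per-unit-length line parameters (illustrative, not a real interconnect)
R, L, C = 5.0, 250e-9, 100e-12        # ohm/m, H/m, F/m
dz = 1e-3                              # 1 mm spatial cell
v_phase = 1.0 / np.sqrt(L * C)         # propagation speed on the line
dt = 0.9 * dz / v_phase                # Courant condition: dt <= dz*sqrt(L*C)

nz, nt = 200, 400
V = np.zeros(nz + 1)                   # node voltages
I = np.zeros(nz)                       # branch currents on the staggered grid

for n in range(nt):
    # Source: ramped unit step at the near end (50 ps rise, an assumption)
    V[0] = min(1.0, n * dt / 50e-12)
    # Current update from the telegrapher's equations; R enters semi-implicitly
    I = ((L / dt - R / 2) * I - np.diff(V) / dz) / (L / dt + R / 2)
    # Voltage update on interior nodes
    V[1:-1] -= dt / (C * dz) * np.diff(I)
    V[-1] = V[-2]                      # crude open-end boundary

print(f"dt = {dt:.3e} s, far-end voltage after {nt} steps: {V[-1]:.3f} V")
```

Violating the Courant bound (e.g. dt = 1.1 * dz / v_phase) makes the same loop diverge; coupled-line crosstalk analysis extends these updates with mutual L and C terms between the lines.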

  18. Time-series analysis of climatologic measurements: a method to distinguish future climatic changes

    International Nuclear Information System (INIS)

    Duband, D.

    1992-01-01

    Time-series analysis of climatic parameters such as air temperature, river flow rates, and lake or sea levels is an indispensable basis for detecting a possible significant climatic change. These observations, when carefully analyzed and criticized, constitute the necessary reference for testing and validating numerical climate models that try to simulate the physical and dynamical processes of the coupled ocean-atmosphere system, taking continents into account. 32 refs., 13 figs

  19. Chernobyl effects on domestic and inbound tourism in Sweden. A time series analysis

    International Nuclear Information System (INIS)

    Hultkrantz, L.; Olsson, C.

    1997-01-01

    This paper estimates the impact of the Chernobyl nuclear accident on domestic and international tourism in Sweden. From ARIMA time series forecasts, outlier search, and intervention analysis based on regional monthly accommodation data from 1978-1989, no effect on domestic tourism is found. However, there is an enduring deterrence effect on incoming tourism. The loss of gross revenue from incoming tourism because of the Chernobyl accident is estimated at 2.5 billion SEK. 5 figs., 7 tabs., 1 appendix, 27 refs

  20. A Time Series Analysis to Asymmetric Marketing Competition Within a Market Structure

    OpenAIRE

    Francisco F. R. Ramos

    1996-01-01

    As a complement to existing studies of competitive market structure analysis, the present paper proposes a time series methodology to provide a more detailed picture of marketing competition in relation to competitive market structure. Two major hypotheses were tested as part of this project. First, it was found that significant cross-lead and lag effects of marketing variables on sales between brands existed even between different submarkets. Second, it was found that high qual...
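Cross-lead and lag effects between two series are typically screened with the sample cross-correlation function. A small Python/NumPy sketch on synthetic data, where a two-period lead is built in by construction:

```python
import numpy as np

def cross_correlation(x, y, max_lag):
    """Sample cross-correlation r_xy(k) for lags -max_lag..max_lag.
    Positive k means x leads y by k periods."""
    x = (x - x.mean()) / x.std()
    y = (y - y.mean()) / y.std()
    n = len(x)
    lags = np.arange(-max_lag, max_lag + 1)
    r = [np.mean(x[:n - k] * y[k:]) if k >= 0 else np.mean(x[-k:] * y[:n + k])
         for k in lags]
    return lags, np.array(r)

# Hypothetical example: brand A's marketing variable (x) leads brand B's
# sales (y) by 2 periods -- synthetic series, not the paper's data
rng = np.random.default_rng(4)
x = rng.normal(size=300)
y = np.roll(x, 2) + 0.3 * rng.normal(size=300)

lags, r = cross_correlation(x, y, max_lag=6)
print("peak cross-correlation at lag", lags[np.argmax(r)])
```

A clear off-zero peak is the usual first indication of a lead or lag effect; formal testing would follow with transfer-function or Granger-causality models.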