WorldWideScience

Sample records for integrated moving-average processes

  1. Monthly streamflow forecasting with auto-regressive integrated moving average

    Science.gov (United States)

    Nasir, Najah; Samsudin, Ruhaidah; Shabri, Ani

    2017-09-01

Forecasting of streamflow is one of the many ways that can contribute to better decision making in water resource management. The auto-regressive integrated moving average (ARIMA) model was selected in this research for monthly streamflow forecasting, enhanced by pre-processing the data with singular spectrum analysis (SSA). This study also proposed an extension of the SSA technique that adds a clustering step on the eigenvector pairs before reconstruction of the time series. The monthly streamflow data of Sungai Muda at Jeniang, Sungai Muda at Jambatan Syed Omar and Sungai Ketil at Kuala Pegang were gathered from the Department of Irrigation and Drainage Malaysia. A ratio of 9:1 was used to divide the data into training and testing sets. The ARIMA, SSA-ARIMA and Clustered SSA-ARIMA models were all developed in R. Results from the proposed model were then compared to a conventional auto-regressive integrated moving average model using root-mean-square error and mean absolute error values. The proposed model was found to outperform the conventional model.
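The workflow in this record (difference the series, fit, forecast, then score a 9:1 train/test split with RMSE and MAE) can be sketched in miniature. This is a hypothetical stand-in, not the authors' R code: it uses a synthetic monthly series and a hand-rolled AR(1) on first differences in place of a full ARIMA/SSA fit.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for a monthly streamflow series (the study's data
# come from the Department of Irrigation and Drainage Malaysia).
n = 240
t = np.arange(n)
y = 50 + 0.05 * t + 10 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 2, n)

# 9:1 train/test split, as in the paper.
split = int(0.9 * n)
train, test = y[:split], y[split:]

# The "I" in ARIMA: first-difference to remove the trend, then fit an
# AR(1) to the differenced series by ordinary least squares.
d = np.diff(train)
X, z = d[:-1], d[1:]
phi = (X @ z) / (X @ X)          # AR(1) coefficient

# Multi-step forecasts over the test horizon, undoing the difference.
preds, last, last_d = [], train[-1], d[-1]
for _ in range(len(test)):
    last_d = phi * last_d
    last = last + last_d
    preds.append(last)
preds = np.array(preds)

rmse = np.sqrt(np.mean((test - preds) ** 2))
mae = np.mean(np.abs(test - preds))
print(rmse, mae)
```

A real replication would add the seasonal terms and the SSA pre-processing step the abstract describes.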

  2. Modeling methane emission via the infinite moving average process

    Czech Academy of Sciences Publication Activity Database

    Jordanova, D.; Dušek, Jiří; Stehlík, M.

    2013-01-01

Vol. 122 (2013), pp. 40-49. ISSN 0169-7439. R&D Projects: GA MŠk(CZ) ED1.1.00/02.0073; GA ČR(CZ) GAP504/11/1151. Institutional support: RVO:67179843. Keywords: Environmental chemistry * Pareto tails * t-Hill estimator * Weak consistency * Moving average process * Methane emission model. Subject RIV: EH - Ecology, Behaviour. Impact factor: 2.381, year: 2013

  3. Autoregressive Moving Average Graph Filtering

    OpenAIRE

    Isufi, Elvin; Loukas, Andreas; Simonetto, Andrea; Leus, Geert

    2016-01-01

Graph filters, direct analogues of classical filters but intended for signals defined on graphs, are one of the cornerstones of the field of signal processing on graphs. This work brings forth new insights on the distributed graph filtering problem. We design a family of autoregressive moving average (ARMA) recursions, which (i) are able to approximate any desired graph frequency response, and (ii) give exact solutions for tasks such as graph signal denoising and interpolation. The design phi...
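The ARMA recursion this record refers to can be illustrated on a tiny graph. A sketch under assumed coefficient names (psi, phi, following the ARMA graph-filter literature, with illustrative values): the distributed iteration converges to a rational graph-frequency response when it is stable.

```python
import numpy as np

# ARMA(1) graph filter sketch: the recursion
#   y_{t+1} = psi * L @ y_t + phi * x
# converges (when the spectral radius of psi*L is below 1) to
#   y = phi * (I - psi*L)^{-1} x,
# a rational function of the graph Laplacian L applied to the signal x.
L_g = np.array([[ 1., -1.,  0.],
                [-1.,  2., -1.],
                [ 0., -1.,  1.]])        # Laplacian of a 3-node path graph
psi, phi = 0.2, 1.0                      # stable: psi * lambda_max = 0.6 < 1
x = np.array([1.0, 0.0, -1.0])           # graph signal

y = np.zeros(3)
for _ in range(100):                     # each step uses only neighbor values
    y = psi * L_g @ y + phi * x

closed_form = phi * np.linalg.solve(np.eye(3) - psi * L_g, x)
print(np.allclose(y, closed_form))
```

The distributed appeal is that each iteration only exchanges values between graph neighbors, yet the limit implements a matrix inverse.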

  4. Forecasting Rice Productivity and Production of Odisha, India, Using Autoregressive Integrated Moving Average Models

    Directory of Open Access Journals (Sweden)

    Rahul Tripathi

    2014-01-01

Forecasting of rice area, production, and productivity of Odisha was made from the historical data of 1950-51 to 2008-09 by using univariate autoregressive integrated moving average (ARIMA) models and was compared with the forecasted all-India data. The autoregressive (p) and moving average (q) parameters were identified based on the significant spikes in the plots of the partial autocorrelation function (PACF) and autocorrelation function (ACF) of the different time series. The ARIMA (2, 1, 0) model was found suitable for all-India rice productivity and production, whereas ARIMA (1, 1, 1) was best fitted for forecasting of rice productivity and production in Odisha. Prediction was made for the immediate next three years, that is, 2007-08, 2008-09, and 2009-10, using the best-fitted ARIMA models based on the minimum value of the selection criteria, that is, the Akaike information criterion (AIC) and Schwarz-Bayesian information criterion (SBC). The performance of the models was validated by comparing the percentage deviation from the actual values and the mean absolute percent error (MAPE), which was found to be 0.61 and 2.99% for the area under rice in Odisha and India, respectively. Similarly, for the prediction of rice production and productivity in Odisha and India, the MAPE was found to be less than 6%.
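The order-identification step described above (read spikes off the ACF/PACF, then pick the order minimizing an information criterion) can be mimicked on simulated data. A minimal sketch, not the study's workflow: the AIC here uses a Gaussian least-squares approximation and only AR orders are compared.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulate an AR(1) series; its sample ACF should decay geometrically,
# the kind of signature used to choose ARIMA orders.
n = 500
x = np.zeros(n)
for t in range(1, n):
    x[t] = 0.7 * x[t - 1] + rng.normal()

def acf(x, nlags):
    # Sample autocorrelation function, normalized so acf[0] == 1.
    x = x - x.mean()
    c0 = np.dot(x, x) / len(x)
    return np.array([np.dot(x[:len(x) - k], x[k:]) / len(x) / c0
                     for k in range(nlags + 1)])

print(acf(x, 5).round(2))   # geometric decay suggests an AR component

def ar_aic(x, p):
    # Fit AR(p) by least squares; AIC from the Gaussian log-likelihood.
    Y = x[p:]
    X = np.column_stack([x[p - j:len(x) - j] for j in range(1, p + 1)])
    beta, *_ = np.linalg.lstsq(X, Y, rcond=None)
    sigma2 = np.mean((Y - X @ beta) ** 2)
    return len(Y) * np.log(sigma2) + 2 * (p + 1)

print(min(range(1, 4), key=lambda p: ar_aic(x, p)))  # AIC-preferred order
```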

  5. Self-similarity of higher-order moving averages

    Science.gov (United States)

    Arianos, Sergio; Carbone, Anna; Türk, Christian

    2011-10-01

In this work, higher-order moving average polynomials are defined by straightforward generalization of the standard moving average. The self-similarity of the polynomials is analyzed for fractional Brownian series and quantified in terms of the Hurst exponent H by using the detrending moving average method. We prove that the exponent H of the fractional Brownian series and of the detrending moving average variance asymptotically agree for the first-order polynomial. Such asymptotic values are compared with the results obtained by simulations. The higher-order polynomials correspond to trend estimates at shorter time scales as the degree of the polynomial increases. Importantly, increasing the polynomial degree does not require changing the moving average window. Thus trends at different time scales can be obtained on data sets of the same size. These polynomials could be interesting for applications relying on trend estimates over different time horizons (financial markets) or on filtering at different frequencies (image analysis).
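The first-order (standard) detrending moving average the record builds on can be sketched directly: detrend with a moving average at window n, measure the residual variance, and read the Hurst exponent off the log-log slope. A sketch on ordinary Brownian motion (H close to 0.5), with illustrative window sizes:

```python
import numpy as np

rng = np.random.default_rng(2)

# Detrending moving average (DMA): for each window n, subtract a moving
# average from the path and take the root-mean-square residual sigma(n).
# The slope of log sigma vs log n estimates the Hurst exponent H.
y = np.cumsum(rng.normal(size=2 ** 14))   # a Brownian path, H ~ 0.5

def dma_sigma(y, n):
    kernel = np.ones(n) / n
    ma = np.convolve(y, kernel, mode="valid")   # causal moving average
    resid = y[n - 1:] - ma
    return np.sqrt(np.mean(resid ** 2))

windows = np.array([8, 16, 32, 64, 128])
sigmas = np.array([dma_sigma(y, n) for n in windows])
H = np.polyfit(np.log(windows), np.log(sigmas), 1)[0]
print(round(H, 2))
```

The paper's higher-order variant replaces the flat moving average with a moving polynomial fit of higher degree over the same window.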

  6. FORECASTING INFUSION STOCK USING THE AUTOREGRESSIVE INTEGRATED MOVING AVERAGE (ARIMA) METHOD AT SANGLAH CENTRAL GENERAL HOSPITAL

    OpenAIRE

    I PUTU YUDI PRABHADIKA; NI KETUT TARI TASTRAWATI; LUH PUTU IDA HARINI

    2018-01-01

Infusion supplies are an important item that a hospital must consider in meeting the needs of patients. This study aims to predict the need for 500 ml 0.9% NaCl infusions and 500 ml 5% glucose infusions at Sanglah Central General Hospital (RSUP Sanglah), so that the hospital can estimate how many infusions will be needed for the next six months. The forecasting method used in this research is the autoregressive integrated moving average (ARIMA) time series method. The results of this study indi...

  7. Medium term municipal solid waste generation prediction by autoregressive integrated moving average

    International Nuclear Information System (INIS)

    Younes, Mohammad K.; Nopiah, Z. M.; Basri, Noor Ezlin A.; Basri, Hassan

    2014-01-01

Generally, solid waste handling and management are performed by the municipality or local authority. In most developing countries, local authorities suffer from serious solid waste management (SWM) problems, insufficient data and poor strategic planning. It is therefore important to develop a robust solid waste generation forecasting model, which helps to properly manage the generated solid waste and to develop future plans based on relatively accurate figures. In Malaysia, the solid waste generation rate is increasing rapidly due to population growth and the new consumption trends that characterize the modern lifestyle. This paper aims to develop a monthly solid waste forecasting model using the Autoregressive Integrated Moving Average (ARIMA) method; such a model is applicable even where data are scarce and will help the municipality properly establish its annual service plan. The results show that an ARIMA (6,1,0) model predicts monthly municipal solid waste generation with a root mean square error of 0.0952, and the model forecast residuals are within the accepted 95% confidence interval.

  8. Medium term municipal solid waste generation prediction by autoregressive integrated moving average

    Science.gov (United States)

    Younes, Mohammad K.; Nopiah, Z. M.; Basri, Noor Ezlin A.; Basri, Hassan

    2014-09-01


  9. Medium term municipal solid waste generation prediction by autoregressive integrated moving average

    Energy Technology Data Exchange (ETDEWEB)

    Younes, Mohammad K.; Nopiah, Z. M.; Basri, Noor Ezlin A.; Basri, Hassan [Department of Civil and Structural Engineering, Faculty of Engineering and Built Environment, Universiti Kebangsaan Malaysia, 43600 Bangi, Selangor (Malaysia)

    2014-09-12


  10. An Invariance Property for the Maximum Likelihood Estimator of the Parameters of a Gaussian Moving Average Process

    OpenAIRE

    Godolphin, E. J.

    1980-01-01

    It is shown that the estimation procedure of Walker leads to estimates of the parameters of a Gaussian moving average process which are asymptotically equivalent to the maximum likelihood estimates proposed by Whittle and represented by Godolphin.
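For context on what is being estimated in this record, the parameter of a Gaussian MA(1) process can be recovered from its lag-1 autocorrelation. A sketch using a simple moment estimator (not Walker's procedure or Whittle's maximum likelihood, to which the record refers):

```python
import numpy as np

rng = np.random.default_rng(3)

# Gaussian MA(1): x_t = e_t + theta * e_{t-1}. Its lag-1 autocorrelation
# is rho = theta / (1 + theta**2), which can be inverted (for the
# invertible root |theta| < 1) to give a moment estimator of theta.
theta = 0.6
e = rng.normal(size=100_000)
x = e[1:] + theta * e[:-1]

xc = x - x.mean()
rho = np.dot(xc[:-1], xc[1:]) / np.dot(xc, xc)       # sample lag-1 ACF
theta_hat = (1 - np.sqrt(1 - 4 * rho ** 2)) / (2 * rho)
print(round(theta_hat, 2))
```

Likelihood-based estimators such as Whittle's are asymptotically more efficient than this moment estimator; the record concerns the equivalence among the likelihood-based ones.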

  11. Relationship research between meteorological disasters and stock markets based on a multifractal detrending moving average algorithm

    Science.gov (United States)

    Li, Qingchen; Cao, Guangxi; Xu, Wei

    2018-01-01

Based on a multifractal detrending moving average algorithm (MFDMA), this study uses the fractionally autoregressive integrated moving average process (ARFIMA) to demonstrate the effectiveness of MFDMA in detecting auto-correlation at different sample lengths and to simulate artificial time series with the same length as the actual sample interval. We analyze the effect of predictable and unpredictable meteorological disasters on the US and Chinese stock markets and the degree of long memory in different sectors. Furthermore, we conduct a preliminary investigation to determine whether the fluctuations of financial markets caused by meteorological disasters derive from the normal evolution of the financial system itself. We also propose several reasonable recommendations.

  12. Capillary Electrophoresis Sensitivity Enhancement Based on Adaptive Moving Average Method.

    Science.gov (United States)

    Drevinskas, Tomas; Telksnys, Laimutis; Maruška, Audrius; Gorbatsova, Jelena; Kaljurand, Mihkel

    2018-06-05

In the present work, we demonstrate a novel approach to improve the sensitivity of "out of lab" portable capillary electrophoretic measurements. Nowadays, many signal enhancement methods are (i) underused (non-optimal), (ii) overused (distorting the data), or (iii) inapplicable in field-portable instrumentation because of a lack of computational power. The described innovative migration-velocity-adaptive moving average method uses an optimal averaging window size and can be easily implemented with a microcontroller. Contactless conductivity detection was used as a model for the development of the signal processing method and the demonstration of its impact on sensitivity. The frequency characteristics of the recorded electropherograms and peaks were clarified: higher-electrophoretic-mobility analytes exhibit higher-frequency peaks, whereas lower-mobility analytes exhibit lower-frequency peaks. On the basis of the obtained data, a migration-velocity-adaptive moving average algorithm was created, adapted, and programmed into capillary electrophoresis data-processing software. Employing the developed algorithm, each data point is processed depending on the migration time of the analyte. With the migration-velocity-adaptive moving average method, the signal-to-noise ratio improved up to 11 times for a sampling frequency of 4.6 Hz and up to 22 times for a sampling frequency of 25 Hz. This paper could potentially serve as a methodological guideline for the development of new smoothing algorithms that require adaptive conditions in capillary electrophoresis and other separation methods.
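The core idea (later-eluting, slower analytes give wider peaks, so the averaging window should grow with migration time instead of staying fixed) can be sketched as follows. The window-growth rule and signal shapes here are illustrative assumptions, not the paper's calibration:

```python
import numpy as np

rng = np.random.default_rng(4)

# A synthetic electropherogram: a narrow early peak, a broad late peak,
# plus white noise standing in for detector noise.
t = np.linspace(0, 100, 2001)
clean = np.exp(-((t - 20) ** 2) / 0.5) + np.exp(-((t - 80) ** 2) / 8.0)
noisy = clean + rng.normal(0, 0.05, t.size)

def adaptive_ma(y, half_widths):
    # Moving average whose window half-width varies point by point.
    out = np.empty_like(y)
    for i, h in enumerate(half_widths):
        lo, hi = max(0, i - h), min(len(y), i + h + 1)
        out[i] = y[lo:hi].mean()
    return out

# Assumed rule: half-width proportional to migration time (min 2 samples).
half_widths = np.maximum(2, (t / 5).astype(int))
smoothed = adaptive_ma(noisy, half_widths)

def snr(sig, ref):
    # Peak height over the std of the deviation from the clean signal.
    return ref.max() / (sig - ref).std()

print(snr(noisy, clean), snr(smoothed, clean))
```

A fixed window wide enough to clean the late peak would distort the early one; the adaptive window avoids that trade-off.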

  13. Adaptive Moving Object Tracking Integrating Neural Networks And Intelligent Processing

    Science.gov (United States)

    Lee, James S. J.; Nguyen, Dziem D.; Lin, C.

    1989-03-01

    A real-time adaptive scheme is introduced to detect and track moving objects under noisy, dynamic conditions including moving sensors. This approach integrates the adaptiveness and incremental learning characteristics of neural networks with intelligent reasoning and process control. Spatiotemporal filtering is used to detect and analyze motion, exploiting the speed and accuracy of multiresolution processing. A neural network algorithm constitutes the basic computational structure for classification. A recognition and learning controller guides the on-line training of the network, and invokes pattern recognition to determine processing parameters dynamically and to verify detection results. A tracking controller acts as the central control unit, so that tracking goals direct the over-all system. Performance is benchmarked against the Widrow-Hoff algorithm, for target detection scenarios presented in diverse FLIR image sequences. Efficient algorithm design ensures that this recognition and control scheme, implemented in software and commercially available image processing hardware, meets the real-time requirements of tracking applications.

  14. Model Identification of Integrated ARMA Processes

    Science.gov (United States)

    Stadnytska, Tetiana; Braun, Simone; Werner, Joachim

    2008-01-01

    This article evaluates the Smallest Canonical Correlation Method (SCAN) and the Extended Sample Autocorrelation Function (ESACF), automated methods for the Autoregressive Integrated Moving-Average (ARIMA) model selection commonly available in current versions of SAS for Windows, as identification tools for integrated processes. SCAN and ESACF can…

  15. MARD—A moving average rose diagram application for the geosciences

    Science.gov (United States)

    Munro, Mark A.; Blenkinsop, Thomas G.

    2012-12-01

    MARD 1.0 is a computer program for generating smoothed rose diagrams by using a moving average, which is designed for use across the wide range of disciplines encompassed within the Earth Sciences. Available in MATLAB®, Microsoft® Excel and GNU Octave formats, the program is fully compatible with both Microsoft® Windows and Macintosh operating systems. Each version has been implemented in a user-friendly way that requires no prior experience in programming with the software. MARD conducts a moving average smoothing, a form of signal processing low-pass filter, upon the raw circular data according to a set of pre-defined conditions selected by the user. This form of signal processing filter smoothes the angular dataset, emphasising significant circular trends whilst reducing background noise. Customisable parameters include whether the data is uni- or bi-directional, the angular range (or aperture) over which the data is averaged, and whether an unweighted or weighted moving average is to be applied. In addition to the uni- and bi-directional options, the MATLAB® and Octave versions also possess a function for plotting 2-dimensional dips/pitches in a single, lower, hemisphere. The rose diagrams from each version are exportable as one of a selection of common graphical formats. Frequently employed statistical measures that determine the vector mean, mean resultant (or length), circular standard deviation and circular variance are also included. MARD's scope is demonstrated via its application to a variety of datasets within the Earth Sciences.
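The smoothing MARD performs can be sketched in a few lines: bin directional data into angular classes, then average each bin with its neighbours over a chosen aperture, wrapping around 360°. The aperture and weights below are user choices, as in MARD; this is an independent sketch, not MARD's source code.

```python
import numpy as np

rng = np.random.default_rng(5)

# Two directional populations, in degrees, folded onto [0, 360).
angles = np.concatenate([rng.normal(45, 10, 300),
                         rng.normal(210, 15, 200)]) % 360

# Rose-diagram frequencies in 10-degree bins.
counts, _ = np.histogram(angles, bins=np.arange(0, 361, 10))

def circular_ma(counts, half_window=2, weights=None):
    # Moving average over angular bins with wrap-around at 360 degrees.
    n = len(counts)
    if weights is None:                       # unweighted variant
        weights = np.ones(2 * half_window + 1)
    weights = weights / weights.sum()
    out = np.zeros(n)
    for i in range(n):
        idx = [(i + k) % n for k in range(-half_window, half_window + 1)]
        out[i] = np.dot(weights, counts[idx])
    return out

smooth = circular_ma(counts, half_window=2)   # aperture of 50 degrees
print(counts.sum(), round(smooth.sum(), 1))   # total frequency is preserved
```

Because the weights are normalized, the smoothing is a low-pass filter that preserves total frequency while suppressing bin-to-bin noise.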

  16. On the speed towards the mean for continuous time autoregressive moving average processes with applications to energy markets

    International Nuclear Information System (INIS)

    Benth, Fred Espen; Taib, Che Mohd Imran Che

    2013-01-01

We extend the concept of half life of an Ornstein–Uhlenbeck process to Lévy-driven continuous-time autoregressive moving average processes with stochastic volatility. The half life becomes state dependent, and we analyze its properties in terms of the characteristics of the process. An empirical example based on daily temperatures observed in Petaling Jaya, Malaysia, is presented, where the proposed model is estimated and the distribution of the half life is simulated. The stationarity of the dynamics yields futures prices which asymptotically tend to a constant at an exponential rate as time to maturity goes to infinity. The rate is characterized by the eigenvalues of the dynamics. An alternative description of this convergence can be given in terms of our concept of half life. - Highlights: • The concept of half life is extended to Lévy-driven continuous time autoregressive moving average processes. • The dynamics of Malaysian temperatures are modeled using a continuous time autoregressive model with stochastic volatility. • Forward prices on temperature become constant when time to maturity tends to infinity. • Convergence in time to maturity is at an exponential rate given by the eigenvalues of the temperature model.
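The classical benchmark the record extends is easy to state: for an Ornstein–Uhlenbeck process dX = -kappa (X - mu) dt + sigma dW, the expected deviation from the mean halves every ln(2)/kappa time units. A sketch with an illustrative kappa (the paper's contribution is making this state dependent under stochastic volatility):

```python
import numpy as np

# Half life of a constant-kappa Ornstein-Uhlenbeck process.
kappa = 0.2                       # mean-reversion speed (illustrative)
half_life = np.log(2) / kappa
print(round(half_life, 2))        # about 3.47 time units

# Sanity check by integrating the deterministic part x' = -kappa * x
# until the deviation has halved.
x, elapsed, dt = 1.0, 0.0, 0.001
while x > 0.5:
    x -= kappa * x * dt
    elapsed += dt
print(round(elapsed, 2))
```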

  17. Identification of moving vehicle forces on bridge structures via moving average Tikhonov regularization

    Science.gov (United States)

    Pan, Chu-Dong; Yu, Ling; Liu, Huan-Lin

    2017-08-01

Traffic-induced moving force identification (MFI) is a typical inverse problem in the field of bridge structural health monitoring. Many regularization-based methods have been proposed for MFI. However, the MFI accuracy obtained from the existing methods is low when the moving forces enter and exit the bridge deck, due to the low sensitivity of structural responses to the forces in these zones. To overcome this shortcoming, a novel moving average Tikhonov regularization method is proposed for MFI by incorporating moving average concepts. Firstly, the bridge-vehicle interaction moving force is assumed to be a discrete finite signal with stable average value (DFS-SAV). Secondly, the reasonable signal feature of DFS-SAV is quantified and introduced to improve the penalty function (||x||_2^2) defined in the classical Tikhonov regularization. Then, a feasible two-step strategy is proposed for selecting the regularization parameter and the balance coefficient defined in the improved penalty function. Finally, both numerical simulations on a simply-supported beam and laboratory experiments on a hollow tube beam are performed to assess the accuracy and feasibility of the proposed method. The results show that the moving forces can be accurately identified with strong robustness. Related issues, such as the selection of the moving window length, the effect of different penalty functions, and the effect of different car speeds, are discussed as well.
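The classical Tikhonov step this record modifies has a closed form: minimizing ||A f - b||^2 + lam ||f||_2^2 gives f = (A^T A + lam I)^{-1} A^T b. A sketch on a generic ill-conditioned system (a Hilbert matrix stands in for the bridge response matrix; this is not the paper's beam model):

```python
import numpy as np

rng = np.random.default_rng(6)

# An ill-conditioned forward operator: the 10x10 Hilbert matrix.
n = 10
A = 1.0 / (np.arange(n)[:, None] + np.arange(n)[None, :] + 1)
f_true = np.ones(n)                       # "force" signal to recover
b = A @ f_true + rng.normal(0, 1e-4, n)   # measurements with small noise

def tikhonov(A, b, lam):
    # Closed-form ridge solution of min ||A f - b||^2 + lam ||f||_2^2.
    return np.linalg.solve(A.T @ A + lam * np.eye(A.shape[1]), A.T @ b)

f_naive = np.linalg.solve(A, b)           # unregularized: noise blows up
f_reg = tikhonov(A, b, lam=1e-8)
print(np.linalg.norm(f_naive - f_true), np.linalg.norm(f_reg - f_true))
```

The paper's contribution is to replace the plain ||f||_2^2 penalty with one that also enforces a stable moving average of the force signal.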

  18. TIME SERIES FORECASTING USING RADIAL BASIS FUNCTION (RBF) AND AUTOREGRESSIVE INTEGRATED MOVING AVERAGE (ARIMA) MODELS

    Directory of Open Access Journals (Sweden)

    DT Wiyanti

    2013-07-01

One of the most widely developed forecasting approaches today is time series analysis, a quantitative approach in which past data serve as the reference for forecasting the future. Various studies have proposed methods for time series problems, among them statistics, neural networks, wavelets, and fuzzy systems. These methods have different strengths and weaknesses, but real-world problems are complex, and a single method may not be able to handle them well. This article discusses the combination of two methods, Auto Regressive Integrated Moving Average (ARIMA) and Radial Basis Function (RBF) networks, motivated by the assumption that a single method cannot fully identify all the characteristics of a time series. Forecasts are made for the Indonesian Wholesale Price Index (IHPB) and commodity inflation data, both spanning 2006 to the first months of 2012, each with six variables. The forecasts of the combined ARIMA-RBF method are compared with those of the ARIMA and RBF methods individually. The analysis shows that the combined ARIMA-RBF model is more accurate than either method alone, as seen in the visual plots, MAPE, and RMSE of all variables in the two test data sets.

  19. Forecasting Construction Tender Price Index in Ghana using Autoregressive Integrated Moving Average with Exogenous Variables Model

    Directory of Open Access Journals (Sweden)

    Ernest Kissi

    2018-03-01

Prices of construction resources keep fluctuating due to the unstable economic situations experienced over the years. Clients' knowledge of their financial commitments toward their intended project remains the basis for their final decision. The use of a construction tender price index provides a realistic estimate at the early stage of a project. The tender price index (TPI) is influenced by various economic factors; hence, several statistical techniques have been employed in forecasting it, including regression, time series and vector error correction models, among others. In recent times, however, the integrated modelling approach has been gaining popularity due to its powerful predictive accuracy. Thus, in line with this assumption, the aim of this study is to apply an autoregressive integrated moving average with exogenous variables (ARIMAX) model to TPI. The results showed that the ARIMAX model has better predictive ability than the single approach. The study further confirms the position of previous research on the need to use integrated model techniques in forecasting TPI. This model will assist practitioners in forecasting future values of the tender price index. Although the study focuses on the Ghanaian economy, the findings can be broadly applicable to other developing countries which share similar economic characteristics.

  20. Quantified moving average strategy of crude oil futures market based on fuzzy logic rules and genetic algorithms

    Science.gov (United States)

    Liu, Xiaojia; An, Haizhong; Wang, Lijun; Guan, Qing

    2017-09-01

The moving average strategy is a technical indicator that can generate trading signals to assist investment. While the trading signals tell traders when to buy or sell, the moving average cannot tell the trading volume, which is a crucial factor for investment. This paper proposes a fuzzy moving average strategy, in which fuzzy logic rules are used to determine the strength of trading signals, i.e., the trading volume. To compose one fuzzy logic rule, we use four types of moving averages, the length of the moving average period, the fuzzy extent, and the recommended value. Ten fuzzy logic rules form a fuzzy set, which generates a rating level that decides the trading volume. In this process, we apply genetic algorithms to identify an optimal fuzzy logic rule set and use crude oil futures prices from the New York Mercantile Exchange (NYMEX) as the experiment data. Each experiment is repeated 20 times. The results show that, firstly, the fuzzy moving average strategy obtains a more stable rate of return than the plain moving average strategies. Secondly, the holding-amount series is highly sensitive to the price series. Thirdly, simple moving average methods are more efficient. Lastly, the fuzzy extents of extremely low, high, and very high are the most popular. These results are helpful in investment decisions.
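The baseline the fuzzy layer builds on is the plain moving-average crossover: go long when a short average is above a long one. A sketch on a synthetic price path (window lengths illustrative; the paper's fuzzy rules would additionally scale the trade volume, which is binary here):

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic geometric-random-walk price series standing in for futures prices.
price = 50 * np.exp(np.cumsum(rng.normal(0.0002, 0.01, 1000)))

def sma(x, n):
    return np.convolve(x, np.ones(n) / n, mode="valid")

short, long_ = 5, 20
s = sma(price, short)[long_ - short:]    # align the two averages in time
l = sma(price, long_)
signal = (s > l).astype(int)             # 1 = hold long, 0 = stay flat

# Strategy returns: yesterday's signal applied to today's log return.
rets = np.diff(np.log(price[long_ - 1:]))
strat = signal[:-1] * rets
print(len(signal), strat.sum())
```

Shifting the signal by one step before applying it avoids look-ahead bias, a common pitfall when backtesting crossover rules.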

  1. On the performance of Autoregressive Moving Average Polynomial

    African Journals Online (AJOL)

    Timothy Ademakinwa

Distributed Lag (PDL) model, Autoregressive Polynomial Distributed Lag … Moving Average Polynomial Distributed Lag (ARMAPDL) model. … Global Journal of Mathematics and Statistics, Vol. 1. … Business and Economic Research Center.

  2. A dynamic analysis of moving average rules

    NARCIS (Netherlands)

    Chiarella, C.; He, X.Z.; Hommes, C.H.

    2006-01-01

The use of various moving average (MA) rules remains popular with financial market practitioners. These rules have recently become the focus of a number of empirical studies, but there have been very few studies of financial market models where some agents employ technical trading rules of the type

  3. Assessing the Efficacy of Adjustable Moving Averages Using ASEAN-5 Currencies.

    Directory of Open Access Journals (Sweden)

    Jacinta Chan Phooi M'ng

The objective of this research is to examine the trends in the exchange rate markets of the ASEAN-5 countries (Indonesia (IDR), Malaysia (MYR), the Philippines (PHP), Singapore (SGD), and Thailand (THB)) through the application of dynamic moving average trading systems. This research offers evidence of the usefulness of the time-varying volatility technical analysis indicator, Adjustable Moving Average (AMA'), in deciphering trends in these ASEAN-5 exchange rate markets. This time-varying volatility factor, referred to as the Efficacy Ratio in this paper, is embedded in AMA'. The Efficacy Ratio adjusts the AMA' to the prevailing market conditions by avoiding whipsaws (losses due, in part, to acting on wrong trading signals, which generally occur when there is no general direction in the market) in range trading and by entering early into new trends in trend trading. The efficacy of AMA' is assessed against other popular moving-average rules. Based on the January 2005 to December 2014 dataset, our findings show that the moving averages and AMA' are superior to the passive buy-and-hold strategy. Specifically, AMA' outperforms the other models for the United States Dollar against PHP (USD/PHP) and USD/THB currency pairs. The results show that different length moving averages perform better in different periods for the five currencies. This is consistent with our hypothesis that a dynamic adjustable technical indicator is needed to cater for different periods in different markets.

  4. Assessing the Efficacy of Adjustable Moving Averages Using ASEAN-5 Currencies.

    Science.gov (United States)

    Chan Phooi M'ng, Jacinta; Zainudin, Rozaimah

    2016-01-01


  5. Modified Exponential Weighted Moving Average (EWMA) Control Chart on Autocorrelation Data

    Science.gov (United States)

    Herdiani, Erna Tri; Fandrilla, Geysa; Sunusi, Nurtiti

    2018-03-01

In general, observations in statistical process control are assumed to be mutually independent. However, this assumption is often violated in practice. Consequently, statistical process controls were developed for interrelated processes, including Shewhart, cumulative sum (CUSUM), and exponentially weighted moving average (EWMA) control charts for autocorrelated data. One researcher stated that these charts are not suitable if the same control limits as in the independent case are used. For this reason, it is necessary to apply a time series model in building the control chart. A classical control chart for independent variables is usually applied to the residual process; this procedure is permitted provided that the residuals are independent. In 1978, a Shewhart modification for the autoregressive process was introduced, using the distance between the sample mean and the target value compared to the standard deviation of the autocorrelated process. In this paper we examine the mean of the EWMA for an autocorrelated process derived from Montgomery and Patel. Performance was investigated by examining the Average Run Length (ARL) based on the Markov chain method.
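The EWMA chart itself follows the standard recursion z_t = lam*x_t + (1-lam)*z_{t-1}, with time-varying control limits mu0 ± L*sigma*sqrt(lam/(2-lam)*(1-(1-lam)^(2t))). A sketch on independent data with an injected mean shift (illustrative lam and L; for autocorrelated data the chart would be run on residuals of a fitted time-series model, as the abstract describes):

```python
import numpy as np

rng = np.random.default_rng(8)

lam, L, mu0, sigma = 0.2, 3.0, 0.0, 1.0     # chart parameters (illustrative)

x = rng.normal(mu0, sigma, 100)
x[60:] += 1.5                               # sustained mean shift to detect

# EWMA recursion z_t = lam * x_t + (1 - lam) * z_{t-1}, z_{-1} = mu0.
z = np.empty_like(x)
z[0] = lam * x[0] + (1 - lam) * mu0
for i in range(1, len(x)):
    z[i] = lam * x[i] + (1 - lam) * z[i - 1]

# Exact (time-varying) control limits for the EWMA statistic.
t = np.arange(1, len(x) + 1)
width = L * sigma * np.sqrt(lam / (2 - lam) * (1 - (1 - lam) ** (2 * t)))
out = np.flatnonzero((z > mu0 + width) | (z < mu0 - width))
print(out[:1])   # index of the first out-of-control sample
```

Smaller lam values make the chart more sensitive to small, persistent shifts, which is why ARL (average run length) studies like the one in this record vary lam and L jointly.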

  6. Short-term electricity prices forecasting based on support vector regression and Auto-regressive integrated moving average modeling

    International Nuclear Information System (INIS)

    Che Jinxing; Wang Jianzhou

    2010-01-01

In this paper, we present the use of different mathematical models to forecast electricity prices under deregulated power markets. A successful electricity price prediction tool can help both power producers and consumers plan their bidding strategies. Inspired by the fact that the support vector regression (SVR) model, with its ε-insensitive loss function, tolerates residuals within the boundary of the ε-tube, we propose a hybrid model that combines the SVR and auto-regressive integrated moving average (ARIMA) models to take advantage of their unique strengths in nonlinear and linear modeling, respectively; the hybrid is called SVRARIMA. A nonlinear analysis of the time series indicates the suitability of nonlinear modeling, so the SVR is applied to capture the nonlinear patterns. ARIMA models have been successfully applied to the residual regression estimation problem. The experimental results demonstrate that the proposed model outperforms existing neural-network approaches, traditional ARIMA models and other hybrid models in terms of root mean square error and mean absolute percentage error.

  7. A note on moving average models for Gaussian random fields

    DEFF Research Database (Denmark)

    Hansen, Linda Vadgård; Thorarinsdottir, Thordis L.

    The class of moving average models offers a flexible modeling framework for Gaussian random fields, with many well known models, such as the Matérn covariance family and the Gaussian covariance, falling under this framework. Moving average models may also be viewed as a kernel smoothing of a Lévy basis, a general modeling framework which includes several types of non-Gaussian models. We propose a new one-parameter spatial correlation model which arises from a power kernel and show that the associated Hausdorff dimension of the sample paths can take any value between 2 and 3. As a result...

  8. Long-Term Prediction of Emergency Department Revenue and Visitor Volume Using Autoregressive Integrated Moving Average Model

    Directory of Open Access Journals (Sweden)

    Chieh-Fan Chen

    2011-01-01

    Full Text Available This study analyzed meteorological, clinical and economic factors in terms of their effects on monthly ED revenue and visitor volume. Monthly data from January 1, 2005 to September 30, 2009 were analyzed. Spearman correlation and cross-correlation analyses were performed to identify the correlation between each independent variable, ED revenue, and visitor volume. An autoregressive integrated moving average (ARIMA) model was used to quantify the relationship between each independent variable, ED revenue, and visitor volume. The accuracies were evaluated by comparing model forecasts to actual values with the mean absolute percentage error. Sensitivity of prediction errors to model training time was also evaluated. The ARIMA models indicated that mean maximum temperature, relative humidity, rainfall, non-trauma, and trauma visits may correlate positively with ED revenue, but mean minimum temperature may correlate negatively with ED revenue. Moreover, mean minimum temperature and stock market index fluctuation may correlate positively with trauma visitor volume. Mean maximum temperature, relative humidity and stock market index fluctuation may correlate positively with non-trauma visitor volume. Mean maximum temperature and relative humidity may correlate positively with pediatric visitor volume, but mean minimum temperature may correlate negatively with pediatric visitor volume. The model also performed well in forecasting revenue and visitor volume.

  9. Moving average rules as a source of market instability

    NARCIS (Netherlands)

    Chiarella, C.; He, X.Z.; Hommes, C.H.

    2006-01-01

    Despite the pervasiveness of the efficient markets paradigm in the academic finance literature, the use of various moving average (MA) trading rules remains popular with financial market practitioners. This paper proposes a stochastic dynamic financial market model in which demand for traded assets

  10. Application of autoregressive moving average model in reactor noise analysis

    International Nuclear Information System (INIS)

    Tran Dinh Tri

    1993-01-01

    The application of an autoregressive (AR) model to estimating noise measurements has achieved many successes in reactor noise analysis over the last ten years. The physical processes that take place in a nuclear reactor, however, are described by an autoregressive moving average (ARMA) model rather than by an AR model. Consequently, more accurate results could be obtained by applying the ARMA model instead of the AR model to reactor noise analysis. In this paper the system of generalised Yule-Walker equations is derived from the equation of an ARMA model, and a method for its solution is given. Numerical results show the applications of the proposed method. (author)
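As a minimal illustration of a Yule-Walker system, the sketch below estimates pure-AR coefficients from sample autocovariances; the paper's generalised Yule-Walker equations for the full ARMA model are more involved:

```python
import random

def autocov(x, lag):
    """Biased sample autocovariance at the given lag."""
    n = len(x)
    m = sum(x) / n
    return sum((x[t] - m) * (x[t - lag] - m) for t in range(lag, n)) / n

def yule_walker_ar(x, p):
    """Solve the ordinary Yule-Walker equations R a = r for AR(p)
    coefficients (pure-AR special case; the generalised system for
    ARMA adds moving-average terms)."""
    r = [autocov(x, k) for k in range(p + 1)]
    A = [[r[abs(i - j)] for j in range(p)] for i in range(p)]
    b = r[1:]
    for i in range(p):            # Gaussian elimination, no pivoting
        for j in range(i + 1, p):
            f = A[j][i] / A[i][i]
            for k in range(i, p):
                A[j][k] -= f * A[i][k]
            b[j] -= f * b[i]
    a = [0.0] * p
    for i in range(p - 1, -1, -1):
        a[i] = (b[i] - sum(A[i][k] * a[k] for k in range(i + 1, p))) / A[i][i]
    return a

# Demo: recover the coefficient of a simulated AR(1) process.
rng = random.Random(42)
series = [0.0]
for _ in range(4000):
    series.append(0.6 * series[-1] + rng.gauss(0.0, 1.0))
phi_hat = yule_walker_ar(series, 1)[0]
```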

  11. [The trial of business data analysis at the Department of Radiology by constructing the auto-regressive integrated moving-average (ARIMA) model].

    Science.gov (United States)

    Tani, Yuji; Ogasawara, Katsuhiko

    2012-01-01

    This study aimed to contribute to the management of a healthcare organization by providing management information derived from time-series analysis of business data accumulated in the hospital information system, which had not been utilized thus far. We examined the performance of a prediction method based on the auto-regressive integrated moving-average (ARIMA) model, using business data obtained at the Radiology Department. We built the model from the number of radiological examinations over the past 9 years, predicted the number of examinations for the final year, and then compared the forecast values with the actual values. The method proved simple and cost-effective, since it used free software, and a simple model could be built by removing trend components from the data during pre-processing. The difference between predicted and actual values was 10%; however, understanding the chronological change was more important than the individual time-series values. Furthermore, our method was highly versatile and adaptable to general time-series data. Therefore, different healthcare organizations can use our method for the analysis and forecasting of their business data.

  12. Kumaraswamy autoregressive moving average models for double bounded environmental data

    Science.gov (United States)

    Bayer, Fábio Mariano; Bayer, Débora Missio; Pumi, Guilherme

    2017-12-01

    In this paper we introduce the Kumaraswamy autoregressive moving average models (KARMA), which is a dynamic class of models for time series taking values in the double bounded interval (a,b) following the Kumaraswamy distribution. The Kumaraswamy family of distribution is widely applied in many areas, especially hydrology and related fields. Classical examples are time series representing rates and proportions observed over time. In the proposed KARMA model, the median is modeled by a dynamic structure containing autoregressive and moving average terms, time-varying regressors, unknown parameters and a link function. We introduce the new class of models and discuss conditional maximum likelihood estimation, hypothesis testing inference, diagnostic analysis and forecasting. In particular, we provide closed-form expressions for the conditional score vector and conditional Fisher information matrix. An application to environmental real data is presented and discussed.

  13. An Exponentially Weighted Moving Average Control Chart for Bernoulli Data

    DEFF Research Database (Denmark)

    Spliid, Henrik

    2010-01-01

    We consider a production process in which units are produced in a sequential manner. The units can, for example, be manufactured items or services provided to clients. Each unit produced can be a failure with probability p or a success (non-failure) with probability (1-p). A novel exponentially weighted moving average (EWMA) control chart intended for surveillance of the probability of failure, p, is described. The chart is based on counting the number of non-failures produced between failures in combination with a variance-stabilizing transformation. The distribution function of the transformation is given and its limit for small values of p is derived. Control of high yield processes is discussed and the chart is shown to perform very well in comparison with both the most common alternative EWMA chart and the CUSUM chart. The construction and the use of the proposed EWMA chart ...

  14. A RED modified weighted moving average for soft real-time application

    Directory of Open Access Journals (Sweden)

    Domanśka Joanna

    2014-09-01

    Full Text Available The popularity of TCP/IP has resulted in an increase in usage of best-effort networks for real-time communication. Much effort has been spent to ensure quality of service for soft real-time traffic over IP networks. The Internet Engineering Task Force has proposed some architecture components, such as Active Queue Management (AQM). The paper investigates the influence of the weighted moving average on packet waiting time reduction for an AQM mechanism: the RED algorithm. The proposed method for computing the average queue length is based on a difference equation (a recursive equation). Depending on a particular optimality criterion, proper parameters of the modified weighted moving average function can be chosen. This change will allow reducing the number of violations of timing constraints and better use of this mechanism for soft real-time transmissions. The optimization problem is solved through simulations performed in OMNeT++ and later verified experimentally on a Linux implementation.
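The recursive (difference-equation) form of the weighted moving average used by RED can be sketched as below; the weight w = 0.002 is RED's commonly quoted default, assumed here for illustration:

```python
def red_avg(queue_samples, w=0.002):
    """RED's recursive weighted moving average of the queue length:
    avg <- (1 - w) * avg + w * q.  The weight w = 0.002 is the commonly
    quoted RED default, used here as an assumption."""
    avg = 0.0
    history = []
    for q in queue_samples:
        avg = (1.0 - w) * avg + w * q
        history.append(avg)
    return history
```

A larger w makes the average track the instantaneous queue length faster, which is the lever the paper tunes to reduce packet waiting times for soft real-time traffic.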

  15. Bivariate copulas on the exponentially weighted moving average control chart

    Directory of Open Access Journals (Sweden)

    Sasigarn Kuvattana

    2016-10-01

    Full Text Available This paper proposes four types of copulas on the Exponentially Weighted Moving Average (EWMA) control chart when observations are from an exponential distribution, using a Monte Carlo simulation approach. The performance of the control chart is based on the Average Run Length (ARL), which is compared for each copula. Copula functions for specifying dependence between random variables are used and measured by Kendall’s tau. The results show that the Normal copula can be used for almost all shifts.

  16. Making the Move: A Mixed Research Integrative Review

    Directory of Open Access Journals (Sweden)

    Sarah Gilbert

    2015-08-01

    Full Text Available The purpose of this mixed research integrative review is to determine factors that influence relocation transitions for older adults who are considering a move from independent living to supervised housing, such as assisted living, using the Theory of Planned Behavior as a conceptual guide. PubMED, CINAHL, and PsychInfo databases were queried using the key words relocation, transition, older adults, and elderly, time-limited from 1992 to 2014. Sixteen articles were retained for review. The majority of articles, qualitative in design, reveal that older adults who comprehend the need to move and participate in the decision-making process of a relocation adjust to new living environments with fewer negative outcomes than older adults who experience a forced relocation. The few quantitative articles examined the elements of impending relocation using a variety of instruments but support the necessity for older adults to recognize the possibility of a future move and contribute to the relocation process. Additionally, the influence of family, friends, and health care providers provides the older adult with support and guidance throughout the process.

  17. Effect of parameters in moving average method for event detection enhancement using phase sensitive OTDR

    Science.gov (United States)

    Kwon, Yong-Seok; Naeem, Khurram; Jeon, Min Yong; Kwon, Il-bum

    2017-04-01

    We analyze the relations among the parameters of the moving average method to enhance the event detectability of a phase-sensitive optical time domain reflectometer (OTDR). If external events have a unique vibration frequency, the control parameters of the moving average method should be optimized to detect these events efficiently. A phase-sensitive OTDR was implemented with a pulsed light source, composed of a laser diode, a semiconductor optical amplifier, an erbium-doped fiber amplifier and a fiber Bragg grating filter, and a light receiving part, which has a photo-detector and a high-speed data acquisition system. The moving average method is operated with the control parameters: total number of raw traces, M; number of averaged traces, N; and step size of moving, n. The raw traces were obtained by the phase-sensitive OTDR with sound signals generated by a speaker. Using these trace data, the relation of the control parameters is analyzed. The results show that, if the event signal has a single frequency, optimal values of N and n exist for detecting the event efficiently.
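A minimal sketch of the moving average method with the abstract's control parameters: M raw traces, N traces averaged per window, and step size n. The list-of-lists trace representation is an assumption for illustration:

```python
def moving_average_traces(traces, N, n):
    """Average N consecutive traces at a time, sliding the window by a
    step of n, over M = len(traces) raw traces (parameter names M, N, n
    follow the abstract; the list-of-lists layout is an assumption)."""
    M = len(traces)
    averaged = []
    for start in range(0, M - N + 1, n):
        block = traces[start:start + N]
        averaged.append([sum(col) / N for col in zip(*block)])
    return averaged
```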

  18. Image compression using moving average histogram and RBF network

    International Nuclear Information System (INIS)

    Khowaja, S.; Ismaili, I.A.

    2015-01-01

    Modernization and globalization have made multimedia technology one of the fastest growing fields in recent times, but optimal use of bandwidth and storage remains a topic that attracts the research community. Considering that images have the lion's share of multimedia communication, efficient image compression techniques have become a basic need for optimal use of bandwidth and space. This paper proposes a novel method for image compression based on a fusion of a moving average histogram and an RBF (Radial Basis Function) network. The proposed technique reduces color intensity levels using the moving average histogram technique, followed by correction of color intensity levels using RBF networks at the reconstruction phase. Existing methods have used low-resolution images for testing, but the proposed method has been tested on various image resolutions to provide a clearer assessment of the technique. The proposed method has been tested on 35 images of varying resolution and compared with existing algorithms in terms of CR (Compression Ratio), MSE (Mean Square Error), PSNR (Peak Signal to Noise Ratio) and computational complexity. The outcome shows that the proposed methodology is a better trade-off in terms of compression ratio, PSNR, which determines the quality of the image, and computational complexity. (author)

  19. Averaging, not internal noise, limits the development of coherent motion processing

    Directory of Open Access Journals (Sweden)

    Catherine Manning

    2014-10-01

    Full Text Available The development of motion processing is a critical part of visual development, allowing children to interact with moving objects and navigate within a dynamic environment. However, global motion processing, which requires pooling motion information across space, develops late, reaching adult-like levels only by mid-to-late childhood. The reasons underlying this protracted development are not yet fully understood. In this study, we sought to determine whether the development of motion coherence sensitivity is limited by internal noise (i.e., imprecision in estimating the directions of individual elements) and/or global pooling across local estimates. To this end, we presented equivalent noise direction discrimination tasks and motion coherence tasks at both slow (1.5°/s) and fast (6°/s) speeds to children aged 5, 7, 9 and 11 years, and adults. We show that, as children get older, their levels of internal noise reduce, and they are able to average across more local motion estimates. Regression analyses indicated, however, that age-related improvements in coherent motion perception are driven solely by improvements in averaging and not by reductions in internal noise. Our results suggest that the development of coherent motion sensitivity is primarily limited by developmental changes within brain regions involved in integrating motion signals (e.g., MT/V5).

  20. Spatial analysis based on variance of moving window averages

    OpenAIRE

    Wu, B M; Subbarao, K V; Ferrandino, F J; Hao, J J

    2006-01-01

    A new method for analysing spatial patterns was designed based on the variance of moving window averages (VMWA), which can be directly calculated in geographical information systems or a spreadsheet program (e.g. MS Excel). Different types of artificial data were generated to test the method. Regardless of data types, the VMWA method correctly determined the mean cluster sizes. This method was also employed to assess spatial patterns in historical plant disease survey data encompassing both a...
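A one-dimensional sketch of the VMWA computation (the paper applies it to 2-D survey maps, but the statistic is the same: the variance of the window means):

```python
def vmwa(values, w):
    """Variance of moving-window averages for a 1-D sequence and window
    size w; simple enough to reproduce in a spreadsheet, as the paper notes."""
    means = [sum(values[i:i + w]) / w for i in range(len(values) - w + 1)]
    mu = sum(means) / len(means)
    return sum((m - mu) ** 2 for m in means) / len(means)

clustered = [1.0] * 5 + [0.0] * 5    # one contiguous cluster
alternating = [1.0, 0.0] * 5         # perfectly regular pattern
```

Scanning w over a range of window sizes and locating the peak variance is how a mean cluster size would be read off.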

  1. A Pareto-optimal moving average multigene genetic programming model for daily streamflow prediction

    Science.gov (United States)

    Danandeh Mehr, Ali; Kahya, Ercan

    2017-06-01

    Genetic programming (GP) is able to systematically explore alternative model structures of different accuracy and complexity from observed input and output data. The effectiveness of GP in hydrological system identification has been recognized in recent studies. However, selecting a parsimonious (accurate and simple) model from such alternatives still remains a question. This paper proposes a Pareto-optimal moving average multigene genetic programming (MA-MGGP) approach to develop a parsimonious model for single-station streamflow prediction. The three main components of the approach that take us from observed data to a validated model are: (1) data pre-processing, (2) system identification and (3) system simplification. The data pre-processing ingredient uses a simple moving average filter to diminish the lagged prediction effect of stand-alone data-driven models. The multigene ingredient tends to identify the underlying nonlinear system with expressions simpler than classical monolithic GP, and the simplification component exploits a Pareto front plot to select a parsimonious model through an interactive complexity-efficiency trade-off. The approach was tested using the daily streamflow records from a station on Senoz Stream, Turkey. Compared to the efficiency results of stand-alone GP, MGGP and conventional multiple linear regression prediction models as benchmarks, the proposed Pareto-optimal MA-MGGP model put forward a parsimonious solution of noteworthy practical importance. In addition, the approach allows the user to bring human insight into the problem, examine evolved models and pick the best performing programs for further analysis.
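The moving-average pre-processing step can be sketched as a trailing filter; the window length is an assumption (the paper selects it as part of model identification):

```python
def ma_filter(series, window):
    """Trailing simple moving average used as a pre-processing smoother;
    the window length is an illustrative assumption."""
    smoothed = []
    for i in range(len(series)):
        chunk = series[max(0, i - window + 1):i + 1]
        smoothed.append(sum(chunk) / len(chunk))
    return smoothed
```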

  2. Modelling and analysis of turbulent datasets using Auto Regressive Moving Average processes

    International Nuclear Information System (INIS)

    Faranda, Davide; Dubrulle, Bérengère; Daviaud, François; Pons, Flavio Maria Emanuele; Saint-Michel, Brice; Herbert, Éric; Cortet, Pierre-Philippe

    2014-01-01

    We introduce a novel way to extract information from turbulent datasets by applying an Auto Regressive Moving Average (ARMA) statistical analysis. Such analysis goes well beyond the analysis of the mean flow and of the fluctuations and links the behavior of the recorded time series to a discrete version of a stochastic differential equation which is able to describe the correlation structure in the dataset. We introduce a new index Υ that measures the difference between the resulting analysis and the Obukhov model of turbulence, the simplest stochastic model reproducing both Richardson law and the Kolmogorov spectrum. We test the method on datasets measured in a von Kármán swirling flow experiment. We found that the ARMA analysis is well correlated with spatial structures of the flow, and can discriminate between two different flows with comparable mean velocities, obtained by changing the forcing. Moreover, we show that the Υ is highest in regions where shear layer vortices are present, thereby establishing a link between deviations from the Kolmogorov model and coherent structures. These deviations are consistent with the ones observed by computing the Hurst exponents for the same time series. We show that some salient features of the analysis are preserved when considering global instead of local observables. Finally, we analyze flow configurations with multistability features where the ARMA technique is efficient in discriminating different stability branches of the system

  3. Using exponentially weighted moving average algorithm to defend against DDoS attacks

    CSIR Research Space (South Africa)

    Machaka, P

    2016-11-01

    Full Text Available This paper seeks to investigate the performance of the Exponentially Weighted Moving Average (EWMA) for mining big data and detection of DDoS attacks in Internet of Things (IoT) infrastructure. The paper will investigate the tradeoff between...
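A minimal EWMA-based anomaly detector in the spirit of the paper; the smoothing factor alpha and threshold multiplier k are illustrative assumptions, not parameters reported by the authors:

```python
def ewma_detector(counts, alpha=0.3, k=3.0):
    """Flag time steps whose traffic count deviates from an EWMA baseline
    by more than k adaptive mean deviations.  alpha and k are illustrative
    assumptions."""
    mean = counts[0]
    dev = 0.0
    alarms = []
    for t, c in enumerate(counts[1:], start=1):
        if dev > 0 and abs(c - mean) > k * dev:
            alarms.append(t)
        dev = alpha * abs(c - mean) + (1.0 - alpha) * dev
        mean = alpha * c + (1.0 - alpha) * mean
    return alarms

# Steady traffic with one large spike (a crude stand-in for a DDoS burst).
traffic = [10, 12, 9, 11, 10, 12, 9, 11, 10, 200]
alarms = ewma_detector(traffic)
```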

  4. Generalized Heteroskedasticity ACF for Moving Average Models in Explicit Forms

    Directory of Open Access Journals (Sweden)

    Samir Khaled Safi

    2014-02-01

    Full Text Available The autocorrelation function (ACF) measures the correlation between observations at different distances apart. We derive explicit equations for the generalized heteroskedasticity ACF for a moving average of order q, MA(q). We consider two cases. Firstly, when the disturbance terms follow the general covariance matrix structure Cov(wi, wj) = S with sij ≠ 0 for all i ≠ j. Secondly, when the diagonal elements of S are not all identical but sij = 0 for i ≠ j, i.e. S = diag(s11, s22, …, stt). The forms of the explicit equations depend essentially on the moving average coefficients and the covariance structure of the disturbance terms.
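For the special homoskedastic case S = σ²I, the ACF of an MA(q) process reduces to the classical form ρ(k) = Σⱼ θⱼθⱼ₊ₖ / Σⱼ θⱼ², which can be checked numerically:

```python
def ma_acf(theta, max_lag):
    """ACF of MA(q): x_t = w_t + theta_1 w_{t-1} + ... + theta_q w_{t-q}
    with iid errors (the homoskedastic special case S = sigma^2 * I of
    the paper's general covariance structure)."""
    c = [1.0] + list(theta)          # theta_0 = 1
    q = len(c) - 1
    def gamma(k):
        return sum(c[j] * c[j + k] for j in range(q - k + 1))
    return [gamma(k) / gamma(0) if k <= q else 0.0
            for k in range(max_lag + 1)]
```

For MA(1) with θ₁ = 0.5, this gives ρ(1) = 0.5/1.25 = 0.4 and ρ(k) = 0 for k > 1, as expected.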

  5. The dynamics of multimodal integration: The averaging diffusion model.

    Science.gov (United States)

    Turner, Brandon M; Gao, Juan; Koenig, Scott; Palfy, Dylan; L McClelland, James

    2017-12-01

    We combine extant theories of evidence accumulation and multi-modal integration to develop an integrated framework for modeling multimodal integration as a process that unfolds in real time. Many studies have formulated sensory processing as a dynamic process where noisy samples of evidence are accumulated until a decision is made. However, these studies are often limited to a single sensory modality. Studies of multimodal stimulus integration have focused on how best to combine different sources of information to elicit a judgment. These studies are often limited to a single time point, typically after the integration process has occurred. We address these limitations by combining the two approaches. Experimentally, we present data that allow us to study the time course of evidence accumulation within each of the visual and auditory domains as well as in a bimodal condition. Theoretically, we develop a new Averaging Diffusion Model in which the decision variable is the mean rather than the sum of evidence samples and use it as a base for comparing three alternative models of multimodal integration, allowing us to assess the optimality of this integration. The outcome reveals rich individual differences in multimodal integration: while some subjects' data are consistent with adaptive optimal integration, reweighting sources of evidence as their relative reliability changes during evidence integration, others exhibit patterns inconsistent with optimality.
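A toy simulation of the paper's central idea, a decision variable that is the running mean of evidence samples rather than their sum; the drift, noise and threshold values are illustrative assumptions:

```python
import random

def averaging_diffusion(drift, noise_sd, threshold, max_steps, rng):
    """Accumulate noisy evidence and monitor the running *mean* of the
    samples (the averaging-diffusion decision variable) rather than their
    sum; respond once |mean| clears the threshold.  All parameter values
    are illustrative assumptions, not fits from the paper."""
    total = 0.0
    for t in range(1, max_steps + 1):
        total += drift + rng.gauss(0.0, noise_sd)
        mean = total / t
        if abs(mean) >= threshold:
            return (1 if mean > 0 else -1), t
    return 0, max_steps  # no decision within the deadline

# With positive drift, most trials should end with the correct (+1) response.
choices = [averaging_diffusion(1.0, 1.0, 0.5, 1000, random.Random(s))[0]
           for s in range(200)]
```

Unlike a summed accumulator, the mean converges toward the drift rate, so late responses become increasingly reliable while early samples dominate fast errors.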

  6. Estimation and Forecasting in Vector Autoregressive Moving Average Models for Rich Datasets

    DEFF Research Database (Denmark)

    Dias, Gustavo Fruet; Kapetanios, George

    We address the issue of modelling and forecasting macroeconomic variables using rich datasets, by adopting the class of Vector Autoregressive Moving Average (VARMA) models. We overcome the estimation issue that arises with this class of models by implementing an iterative ordinary least squares (...

  7. Compact and accurate linear and nonlinear autoregressive moving average model parameter estimation using laguerre functions

    DEFF Research Database (Denmark)

    Chon, K H; Cohen, R J; Holstein-Rathlou, N H

    1997-01-01

    A linear and nonlinear autoregressive moving average (ARMA) identification algorithm is developed for modeling time series data. The algorithm uses Laguerre expansion of kernels (LEK) to estimate Volterra-Wiener kernels. However, instead of estimating linear and nonlinear system dynamics via moving average models, as is the case for the Volterra-Wiener analysis, we propose an ARMA model-based approach. The proposed algorithm is essentially the same as LEK, but it is extended to include past values of the output as well. Thus, all of the advantages associated with using the Laguerre...

  8. Forecast of sea surface temperature off the Peruvian coast using an autoregressive integrated moving average model

    Directory of Open Access Journals (Sweden)

    Carlos Quispe

    2013-04-01

    Full Text Available El Niño links climate, ecosystems and socio-economic activities globally. Attempts to predict this event date back to 1980, but statistical and dynamical models have so far proved insufficient. Thus, the objective of the present work was to explore, using an autoregressive moving average model, the effect of El Niño on the sea surface temperature (SST) off the Peruvian coast. The work involved 5 stages: identification, estimation, diagnostic checking, forecasting and validation. Simple and partial autocorrelation functions (ACF and PACF) were used to identify and reformulate the orders of the model parameters, while the Akaike information criterion (AIC) and Schwarz criterion (SC) were used to select the best models during diagnostic checking. Among the main results, ARIMA(12,0,11) models were proposed, which simulated monthly conditions in agreement with those observed off the Peruvian coast: cold conditions at the end of 2004, and neutral conditions at the beginning of 2005.

  9. Generalized Heteroskedasticity ACF for Moving Average Models in Explicit Forms

    OpenAIRE

    Samir Khaled Safi

    2014-01-01

    The autocorrelation function (ACF) measures the correlation between observations at different distances apart. We derive explicit equations for generalized heteroskedasticity ACF for a moving average of order q, MA(q). We consider two cases. Firstly, when the disturbance terms follow the general covariance matrix structure Cov(wi, wj) = S with sij ≠ 0 for all i ≠ j. Secondly, when the diagonal elements of S are not all identical but sij = 0 for i ≠ j, i.e. S = diag(s11, s22, …

  10. Middle and long-term prediction of UT1-UTC based on combination of Gray Model and Autoregressive Integrated Moving Average

    Science.gov (United States)

    Jia, Song; Xu, Tian-he; Sun, Zhang-zhen; Li, Jia-jing

    2017-02-01

    UT1-UTC is an important part of the Earth Orientation Parameters (EOP). High-precision predictions of UT1-UTC play a key role in practical applications of deep space exploration, spacecraft tracking, and satellite navigation and positioning. In this paper, a new prediction method combining the Gray Model (GM(1, 1)) and the Autoregressive Integrated Moving Average (ARIMA) is developed. The main idea is as follows. Firstly, the UT1-UTC data are preprocessed by removing the leap seconds and the Earth's zonal harmonic tidal terms to get UT1R-TAI data. Periodic terms are estimated and removed by least squares to get UT2R-TAI. Then the linear terms of the UT2R-TAI data are modeled by the GM(1, 1), and the residual terms are modeled by the ARIMA. Finally, the UT2R-TAI prediction can be performed based on the combined model of GM(1, 1) and ARIMA, and the UT1-UTC predictions are obtained by adding back the corresponding periodic terms, leap second correction and Earth's zonal harmonic tidal correction. The results show that the proposed model can predict UT1-UTC effectively, with higher middle and long-term (from 32 to 360 days) accuracy than LS + AR, LS + MAR and WLS + MAR.
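A minimal textbook GM(1,1) fit (cumulative sum, background values, least squares for the development coefficient a and grey input b); this sketch covers only the grey-model half of the paper's GM(1,1)+ARIMA combination:

```python
import math

def gm11_fit(x0):
    """Fit GM(1,1) to a positive sequence x0 and return a forecast
    function for x0 values.  A minimal textbook implementation, not the
    paper's combined GM + ARIMA scheme."""
    n = len(x0)
    x1 = [x0[0]]                              # accumulated (AGO) series
    for v in x0[1:]:
        x1.append(x1[-1] + v)
    z = [0.5 * (x1[k] + x1[k - 1]) for k in range(1, n)]  # background values
    y = x0[1:]
    # Least squares for x0[k] = -a * z[k] + b, via the normal equations.
    szz = sum(zi * zi for zi in z)
    sz = sum(z)
    sy = sum(y)
    szy = sum(zi * yi for zi, yi in zip(z, y))
    m = n - 1
    det = szz * m - sz * sz
    a = (sz * sy - m * szy) / det
    b = (szz * sy - sz * szy) / det
    def forecast(k):
        """Predicted x0 at step k (k = 0 returns the first observation)."""
        if k == 0:
            return x0[0]
        x1k = (x0[0] - b / a) * math.exp(-a * k) + b / a
        x1k1 = (x0[0] - b / a) * math.exp(-a * (k - 1)) + b / a
        return x1k - x1k1                     # de-accumulate
    return forecast
```

GM(1,1) fits near-exponential trends, which is why the paper uses it for the linear/trend part and hands the residuals to ARIMA.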

  11. Experimental investigation of a moving averaging algorithm for motion perpendicular to the leaf travel direction in dynamic MLC target tracking

    Energy Technology Data Exchange (ETDEWEB)

    Yoon, Jai-Woong; Sawant, Amit; Suh, Yelin; Cho, Byung-Chul; Suh, Tae-Suk; Keall, Paul [Department of Biomedical Engineering, College of Medicine, Catholic University of Korea, Seoul, Korea 131-700 and Research Institute of Biomedical Engineering, Catholic University of Korea, Seoul, 131-700 (Korea, Republic of); Department of Radiation Oncology, Stanford University, Stanford, California 94305 (United States); Department of Radiation Oncology, Stanford University, Stanford, California 94305 (United States) and Department of Radiation Oncology, Asan Medical Center, Seoul, 138-736 (Korea, Republic of); Department of Biomedical Engineering, College of Medicine, Catholic University of Korea, Seoul, 131-700 and Research Institute of Biomedical Engineering, Catholic University of Korea, Seoul, 131-700 (Korea, Republic of); Department of Radiation Oncology, Stanford University, Stanford, California 94305 (United States) and Radiation Physics Laboratory, Sydney Medical School, University of Sydney, 2006 (Australia)

    2011-07-15

    Purpose: In dynamic multileaf collimator (MLC) motion tracking with complex intensity-modulated radiation therapy (IMRT) fields, target motion perpendicular to the MLC leaf travel direction can cause beam holds, which increase beam delivery time by up to a factor of 4. As a means to balance delivery efficiency and accuracy, a moving average algorithm was incorporated into a dynamic MLC motion tracking system (i.e., moving average tracking) to account for target motion perpendicular to the MLC leaf travel direction. The experimental investigation of the moving average algorithm compared with real-time tracking and no compensation beam delivery is described. Methods: The properties of the moving average algorithm were measured and compared with those of real-time tracking (dynamic MLC motion tracking accounting for both target motion parallel and perpendicular to the leaf travel direction) and no compensation beam delivery. The algorithm was investigated using a synthetic motion trace with a baseline drift and four patient-measured 3D tumor motion traces representing regular and irregular motions with varying baseline drifts. Each motion trace was reproduced by a moving platform. The delivery efficiency, geometric accuracy, and dosimetric accuracy were evaluated for conformal, step-and-shoot IMRT, and dynamic sliding window IMRT treatment plans using the synthetic and patient motion traces. The dosimetric accuracy was quantified via a γ-test with a 3%/3 mm criterion. Results: The delivery efficiency ranged from 89 to 100% for moving average tracking, 26%-100% for real-time tracking, and 100% (by definition) for no compensation. The root-mean-square geometric error ranged from 3.2 to 4.0 mm for moving average tracking, 0.7-1.1 mm for real-time tracking, and 3.7-7.2 mm for no compensation. The percentage of dosimetric points failing the γ-test ranged from 4 to 30% for moving average tracking, 0%-23% for real-time tracking, and 10%-47% for no compensation.

  12. Experimental investigation of a moving averaging algorithm for motion perpendicular to the leaf travel direction in dynamic MLC target tracking.

    Science.gov (United States)

    Yoon, Jai-Woong; Sawant, Amit; Suh, Yelin; Cho, Byung-Chul; Suh, Tae-Suk; Keall, Paul

    2011-07-01

    In dynamic multileaf collimator (MLC) motion tracking with complex intensity-modulated radiation therapy (IMRT) fields, target motion perpendicular to the MLC leaf travel direction can cause beam holds, which increase beam delivery time by up to a factor of 4. As a means to balance delivery efficiency and accuracy, a moving average algorithm was incorporated into a dynamic MLC motion tracking system (i.e., moving average tracking) to account for target motion perpendicular to the MLC leaf travel direction. The experimental investigation of the moving average algorithm compared with real-time tracking and no compensation beam delivery is described. The properties of the moving average algorithm were measured and compared with those of real-time tracking (dynamic MLC motion tracking accounting for both target motion parallel and perpendicular to the leaf travel direction) and no compensation beam delivery. The algorithm was investigated using a synthetic motion trace with a baseline drift and four patient-measured 3D tumor motion traces representing regular and irregular motions with varying baseline drifts. Each motion trace was reproduced by a moving platform. The delivery efficiency, geometric accuracy, and dosimetric accuracy were evaluated for conformal, step-and-shoot IMRT, and dynamic sliding window IMRT treatment plans using the synthetic and patient motion traces. The dosimetric accuracy was quantified via a γ-test with a 3%/3 mm criterion. The delivery efficiency ranged from 89 to 100% for moving average tracking, 26%-100% for real-time tracking, and 100% (by definition) for no compensation. The root-mean-square geometric error ranged from 3.2 to 4.0 mm for moving average tracking, 0.7-1.1 mm for real-time tracking, and 3.7-7.2 mm for no compensation. The percentage of dosimetric points failing the γ-test ranged from 4 to 30% for moving average tracking, 0%-23% for real-time tracking, and 10%-47% for no compensation. The delivery efficiency of

  13. Exponentially Weighted Moving Average Chart as a Suitable Tool for Nuchal Translucency Quality Review

    Czech Academy of Sciences Publication Activity Database

    Hynek, M.; Smetanová, D.; Stejskal, D.; Zvárová, Jana

    2014-01-01

    Roč. 34, č. 4 (2014), s. 367-376 ISSN 0197-3851 Institutional support: RVO:67985807 Keywords : nuchal translucency * exponentially weighted moving average model * statistics Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 3.268, year: 2014

  14. Autoregressive moving average fitting for real standard deviation in Monte Carlo power distribution calculation

    International Nuclear Information System (INIS)

    Ueki, Taro

    2010-01-01

    The noise propagation of tallies in the Monte Carlo power method can be represented by the autoregressive moving average process of orders p and p-1 [ARMA(p,p-1)], where p is an integer larger than or equal to two. The formula of the autocorrelation of ARMA(p,q), p≥q+1, indicates that ARMA(3,2) fitting is equivalent to lumping the eigenmodes of fluctuation propagation into three modes: the slow, intermediate and fast attenuation modes. Therefore, ARMA(3,2) fitting was applied to the real standard deviation estimation of fuel assemblies at particular heights. The numerical results show that straightforward ARMA(3,2) fitting is promising, but a stability issue must be resolved before incorporation into the distributed version of production Monte Carlo codes. The same numerical results reveal that the average performance of ARMA(3,2) fitting is equivalent to that of the batch method in MCNP with a batch size larger than one hundred and smaller than two hundred cycles for a 1100 MWe pressurized water reactor. The bias correction of low lag autocovariances in MVP/GMVP is demonstrated to have the potential of improving the average performance of ARMA(3,2) fitting. (author)
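
    The idea of correcting a naive tally standard deviation for serial correlation can be illustrated in miniature. The sketch below is not the ARMA(3,2) fitting of the paper; it uses the simpler AR(1) special case, where the variance of the sample mean is inflated by the factor (1 + rho)/(1 - rho):

```python
def lag1_autocorr(x):
    """Biased lag-1 autocorrelation estimate of a tally sequence."""
    n = len(x)
    m = sum(x) / n
    num = sum((x[i] - m) * (x[i + 1] - m) for i in range(n - 1))
    den = sum((xi - m) ** 2 for xi in x)
    return num / den

def corrected_std_error(x):
    """Standard error of the mean, inflated for AR(1) serial correlation
    by the factor (1 + rho) / (1 - rho)."""
    n = len(x)
    m = sum(x) / n
    var = sum((xi - m) ** 2 for xi in x) / (n - 1)
    rho = lag1_autocorr(x)
    return (var * (1 + rho) / (1 - rho) / n) ** 0.5
```

    A positively correlated sequence (rho > 0) yields a larger standard error than the naive independent-sample formula, which is the effect the ARMA(3,2) fit captures with three attenuation modes instead of one.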

  15. Analysis of nonlinear systems using ARMA [autoregressive moving average] models

    International Nuclear Information System (INIS)

    Hunter, N.F. Jr.

    1990-01-01

    While many vibration systems exhibit primarily linear behavior, a significant percentage of the systems encountered in vibration and model testing are mildly to severely nonlinear. Analysis methods for such nonlinear systems are not yet well developed and the response of such systems is not accurately predicted by linear models. Nonlinear ARMA (autoregressive moving average) models are one method for the analysis and response prediction of nonlinear vibratory systems. In this paper we review the background of linear and nonlinear ARMA models, and illustrate the application of these models to nonlinear vibration systems. We conclude by summarizing the advantages and disadvantages of ARMA models and emphasizing prospects for future development. 14 refs., 11 figs

  16. Hybrid support vector regression and autoregressive integrated moving average models improved by particle swarm optimization for property crime rates forecasting with economic indicators.

    Science.gov (United States)

    Alwee, Razana; Shamsuddin, Siti Mariyam Hj; Sallehuddin, Roselina

    2013-01-01

    Crime forecasting is an important area in the field of criminology. Linear models, such as regression and econometric models, are commonly applied in crime forecasting. However, in real crime data, it is common that the data consists of both linear and nonlinear components. A single model may not be sufficient to identify all the characteristics of the data. The purpose of this study is to introduce a hybrid model that combines support vector regression (SVR) and autoregressive integrated moving average (ARIMA) to be applied in crime rates forecasting. SVR is very robust with small training data and high-dimensional problems. Meanwhile, ARIMA has the ability to model several types of time series. However, the accuracy of the SVR model depends on the values of its parameters, while ARIMA is not robust when applied to small data sets. Therefore, to overcome this problem, particle swarm optimization is used to estimate the parameters of the SVR and ARIMA models. The proposed hybrid model is used to forecast the property crime rates of the United States based on economic indicators. The experimental results show that the proposed hybrid model is able to produce more accurate forecasting results as compared to the individual models.
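
    The hybrid scheme, fit a linear time-series model and then model its residuals with a nonlinear learner, can be sketched as follows. This is a deliberately simplified stand-in: a least-squares AR(1) replaces ARIMA, and a 1-nearest-neighbour rule on past residuals replaces the PSO-tuned SVR:

```python
def fit_ar1(series):
    """Least-squares fit of x[t] = c + phi * x[t-1] (stand-in for ARIMA)."""
    xs, ys = series[:-1], series[1:]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    phi = (sum((a - mx) * (b - my) for a, b in zip(xs, ys))
           / sum((a - mx) ** 2 for a in xs))
    return my - phi * mx, phi

def hybrid_forecast(series):
    """Linear AR(1) forecast plus a 1-NN forecast of its residuals
    (the 1-NN rule stands in for the PSO-tuned SVR of the paper)."""
    c, phi = fit_ar1(series)
    residuals = [series[t] - (c + phi * series[t - 1])
                 for t in range(1, len(series))]
    # Nonlinear part: find the past residual most similar to the latest
    # one and predict that its successor repeats.
    last = residuals[-1]
    nn = min(range(len(residuals) - 1), key=lambda i: abs(residuals[i] - last))
    return (c + phi * series[-1]) + residuals[nn + 1]
```

    The design point is the decomposition itself: the linear model explains what it can, and the nonlinear learner only has to model the leftover structure.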

  17. Hybrid Support Vector Regression and Autoregressive Integrated Moving Average Models Improved by Particle Swarm Optimization for Property Crime Rates Forecasting with Economic Indicators

    Directory of Open Access Journals (Sweden)

    Razana Alwee

    2013-01-01

    Full Text Available Crime forecasting is an important area in the field of criminology. Linear models, such as regression and econometric models, are commonly applied in crime forecasting. However, in real crime data, it is common that the data consists of both linear and nonlinear components. A single model may not be sufficient to identify all the characteristics of the data. The purpose of this study is to introduce a hybrid model that combines support vector regression (SVR) and autoregressive integrated moving average (ARIMA) to be applied in crime rates forecasting. SVR is very robust with small training data and high-dimensional problems. Meanwhile, ARIMA has the ability to model several types of time series. However, the accuracy of the SVR model depends on the values of its parameters, while ARIMA is not robust when applied to small data sets. Therefore, to overcome this problem, particle swarm optimization is used to estimate the parameters of the SVR and ARIMA models. The proposed hybrid model is used to forecast the property crime rates of the United States based on economic indicators. The experimental results show that the proposed hybrid model is able to produce more accurate forecasting results as compared to the individual models.

  18. The Integration of Extrarational and Rational Learning Processes: Moving Towards the Whole Learner.

    Science.gov (United States)

    Puk, Tom

    1996-01-01

    Discusses the dichotomy between rational and nonrational learning processes, arguing for an integration of both. Reviews information processing theory and related learning strategies. Presents a model instructional strategy that fully integrates rational and nonrational processes. Describes implications for teaching and learning of the learning…

  19. Stable non-Gaussian self-similar processes with stationary increments

    CERN Document Server

    Pipiras, Vladas

    2017-01-01

    This book provides a self-contained presentation on the structure of a large class of stable processes, known as self-similar mixed moving averages. The authors present a way to describe and classify these processes by relating them to so-called deterministic flows. The first sections in the book review random variables, stochastic processes, and integrals, moving on to rigidity and flows, and finally ending with mixed moving averages and self-similarity. In-depth appendices are also included. This book is aimed at graduate students and researchers working in probability theory and statistics.

  20. Dosimetric consequences of planning lung treatments on 4DCT average reconstruction to represent a moving tumour

    International Nuclear Information System (INIS)

    Dunn, L.F.; Taylor, M.L.; Kron, T.; Franich, R.

    2010-01-01

    Full text: Anatomic motion during a radiotherapy treatment is one of the more significant challenges in contemporary radiation therapy. For tumours of the lung, motion due to patient respiration makes both accurate planning and dose delivery difficult. One approach is to use the maximum intensity projection (MIP) obtained from a 4D computed tomography (CT) scan and then use this to determine the treatment volume. The treatment is then planned on a 4DCT average reconstruction, rather than assuming the entire ITV has a uniform tumour density. This raises the question: how well does planning on a 'blurred' distribution of density with CT values greater than lung density but less than tumour density match the true case of a tumour moving within lung tissue? The aim of this study was to answer this question, determining the dosimetric impact of using a 4DCT average reconstruction as the basis for a radiotherapy treatment plan. To achieve this, Monte Carlo simulations were undertaken using GEANT4. The geometry consisted of a tumour (diameter 30 mm) moving with a sinusoidal pattern of amplitude = 20 mm. The tumour's excursion occurs within a lung-equivalent volume beyond a chest wall interface. Motion was defined parallel to a 6 MV beam. This was then compared to a single oblate tumour of a magnitude determined by the extremes of the tumour motion. The variable density of the 4DCT average tumour is simulated by a time-weighted average, to achieve the observed density gradient. The generic moving tumour geometry is illustrated in the Figure.
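
    The time-weighted average used to build the blurred density can be illustrated directly: the density assigned to a voxel is the fraction of breathing phases during which the tumour covers it, interpolated between tumour and lung densities. The sketch below uses a hypothetical 1D geometry and hypothetical densities, not the study's GEANT4 setup:

```python
def time_weighted_density(positions, voxel_x, radius, rho_tumour, rho_lung):
    """Density a 4DCT average reconstruction assigns to a voxel: the
    fraction of sampled breathing phases in which the (1D) tumour of the
    given radius covers the voxel, blended between the two densities."""
    inside = sum(1 for p in positions if abs(voxel_x - p) <= radius)
    f = inside / len(positions)
    return f * rho_tumour + (1 - f) * rho_lung
```

    Voxels near the centre of the excursion spend most phases inside the tumour and approach tumour density; voxels near the extremes approach lung density, producing the observed gradient.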

  1. Dual-component model of respiratory motion based on the periodic autoregressive moving average (periodic ARMA) method

    International Nuclear Information System (INIS)

    McCall, K C; Jeraj, R

    2007-01-01

    A new approach to the problem of modelling and predicting respiration motion has been implemented. This is a dual-component model, which describes the respiration motion as a non-periodic time series superimposed onto a periodic waveform. A periodic autoregressive moving average algorithm has been used to define a mathematical model of the periodic and non-periodic components of the respiration motion. The periodic components of the motion were found by projecting multiple inhale-exhale cycles onto a common subspace. The component of the respiration signal that is left after removing this periodicity is a partially autocorrelated time series and was modelled as an autoregressive moving average (ARMA) process. The accuracy of the periodic ARMA model with respect to fluctuation in amplitude and variation in length of cycles has been assessed. A respiration phantom was developed to simulate the inter-cycle variations seen in free-breathing and coached respiration patterns. At ±14% variability in cycle length and maximum amplitude of motion, the prediction errors were 4.8% of the total motion extent for a 0.5 s ahead prediction, and 9.4% at 1.0 s lag. The prediction errors increased to 11.6% at 0.5 s and 21.6% at 1.0 s when the respiration pattern had ±34% variations in both these parameters. Our results have shown that the accuracy of the periodic ARMA model is more strongly dependent on the variations in cycle length than the amplitude of the respiration cycles
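
    The dual-component decomposition can be sketched in a few lines: projecting cycles onto a common subspace reduces, in the simplest case, to averaging samples at the same phase across cycles, and what remains is the non-periodic residual the ARMA model would describe. The fixed integer period below is an assumption (real respiration cycles vary in length, which is exactly what the paper studies):

```python
def periodic_component(signal, period):
    """Average samples at the same phase across cycles: the simplest
    projection of many breathing cycles onto a common periodic template."""
    return [sum(signal[phase::period]) / len(signal[phase::period])
            for phase in range(period)]

def decompose(signal, period):
    """Split a respiration trace into its periodic template and the
    non-periodic residual that an ARMA model would describe."""
    template = periodic_component(signal, period)
    residual = [x - template[i % period] for i, x in enumerate(signal)]
    return template, residual
```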

  2. Prediction of Tourist Arrivals to the Island of Bali with the Holt-Winters Method and Seasonal Autoregressive Integrated Moving Average (SARIMA)

    Directory of Open Access Journals (Sweden)

    Agus Supriatna

    2017-11-01

    Full Text Available The tourism sector is one of the contributors of foreign exchange that is quite influential in improving the economy of Indonesia. The development of this sector has a positive impact, including employment opportunities and opportunities for entrepreneurship in various industries such as adventure tourism, crafts, and hospitality. The beauty and natural resources owned by Indonesia are a tourist attraction for domestic and foreign tourists. One of the many tourist destinations is the island of Bali. The island of Bali is famous not only for its natural beauty but also for the cultural diversity and arts that add to its tourism value. In 2015 the number of tourist arrivals increased by 6.24% from the previous year. To improve the quality of services, cope with surges of visitors, or prepare strategies for attracting tourists, a prediction of arrivals is needed so that planning can be more efficient and effective. This research used the Holt-Winters method and the Seasonal Autoregressive Integrated Moving Average (SARIMA) method to predict tourist arrivals. Based on data of foreign tourist arrivals who visited the island of Bali from January 2007 until June 2016, the Holt-Winters method with parameter values α=0.1, β=0.1, γ=0.3 yields a MAPE of 6.171873, while the SARIMA method with the (0,1,1)(1,0,0)₁₂ model yields a MAPE of 5.788615; it can be concluded that the SARIMA method is better. Keywords: Foreign Tourist, Prediction, Bali Island, Holt-Winters, SARIMA.
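
    Two ingredients of the comparison above are easy to state precisely: the differencing that gives SARIMA its "I" (the nonseasonal d=1 term and its seasonal lag-12 analogue for monthly data) and the MAPE used to score the forecasts. A minimal sketch (fitting an actual SARIMA model would be done with a statistics package such as statsmodels, whose SARIMAX class accepts an order and a seasonal order):

```python
def difference(x, lag=1):
    """Lag differencing: lag=1 gives the nonseasonal d=1 term,
    lag=12 the seasonal term for monthly data."""
    return [x[i] - x[i - lag] for i in range(lag, len(x))]

def mape(actual, forecast):
    """Mean absolute percentage error, the score used to compare
    Holt-Winters against SARIMA above."""
    return 100 * sum(abs((a - f) / a)
                     for a, f in zip(actual, forecast)) / len(actual)
```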

  3. Moving energies as first integrals of nonholonomic systems with affine constraints

    Science.gov (United States)

    Fassò, Francesco; García-Naranjo, Luis C.; Sansonetto, Nicola

    2018-03-01

    In nonholonomic mechanical systems with constraints that are affine (linear nonhomogeneous) functions of the velocities, the energy is typically not a first integral. It was shown in Fassò and Sansonetto (2016 J. Nonlinear Sci. 26 519-44) that, nevertheless, there exist modifications of the energy, called there moving energies, which under suitable conditions are first integrals. The first goal of this paper is to study the properties of these functions and the conditions that lead to their conservation. In particular, we enlarge the class of moving energies considered in Fassò and Sansonetto (2016 J. Nonlinear Sci. 26 519-44). The second goal of the paper is to demonstrate the relevance of moving energies in nonholonomic mechanics. We show that certain first integrals of some well known systems (the affine Veselova and LR systems), which had been detected on a case-by-case way, are instances of moving energies. Moreover, we determine conserved moving energies for a class of affine systems on Lie groups that include the LR systems, for a heavy convex rigid body that rolls without slipping on a uniformly rotating plane, and for an n-dimensional generalization of the Chaplygin sphere problem to a uniformly rotating hyperplane.

  4. A generalization of the preset count moving average algorithm for digital rate meters

    International Nuclear Information System (INIS)

    Arandjelovic, Vojislav; Koturovic, Aleksandar; Vukanovic, Radomir

    2002-01-01

    A generalized definition of the preset count moving average algorithm for digital rate meters has been introduced. The algorithm is based on the knowledge of time intervals between successive pulses in random-pulse sequences. The steady state and transient regimes of the algorithm have been characterized. A measure for statistical fluctuations of the successive measurement results has been introduced. The versatility of the generalized algorithm makes it suitable for application in the design of the software of modern measuring/control digital systems
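
    The preset count moving average can be sketched as follows: given the time intervals between successive pulses, the rate estimate after each pulse is the preset count N divided by the sum of the most recent N inter-pulse intervals. The sketch below illustrates this definition only, not the generalized algorithm or its transient analysis:

```python
from collections import deque

def preset_count_rate(intervals, n):
    """Preset count moving average: after each pulse, estimate the count
    rate as (number of buffered intervals) / (their total duration)."""
    buf = deque(maxlen=n)
    rates = []
    for dt in intervals:
        buf.append(dt)
        rates.append(len(buf) / sum(buf))
    return rates
```

    A larger preset count N smooths statistical fluctuations at the price of a slower response to genuine rate changes, which is the trade-off the paper characterizes.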

  5. Quantifying walking and standing behaviour of dairy cows using a moving average based on output from an accelerometer

    DEFF Research Database (Denmark)

    Nielsen, Lars Relund; Pedersen, Asger Roer; Herskin, Mette S

    2010-01-01

    in sequences of approximately 20 s for the period of 10 min. Afterwards the cows were stimulated to move/lift the legs while standing in a cubicle. The behaviour was video recorded, and the recordings were analysed second by second for walking and standing behaviour as well as the number of steps taken....... Various algorithms for predicting walking/standing status were compared. The algorithms were all based on a limit of a moving average calculated by using one of two outputs of the accelerometer, either a motion index or a step count, and applied over periods of 3 or 5 s. Furthermore, we investigated...... the effect of additionally applying the rule: a walking period must last at least 5 s. The results indicate that the lowest misclassification rate (10%) of walking and standing was obtained based on the step count with a moving average of 3 s and with the rule applied. However, the rate of misclassification...
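
    The best-performing rule reported above, a trailing moving average of the per-second step count over 3 s, a threshold, and the requirement that a walking period last at least 5 s, can be sketched as follows. The threshold value is hypothetical; the paper's actual limit is not reproduced here:

```python
def classify_walking(step_counts, window=3, threshold=0.5, min_run=5):
    """Label each second as walking (True) or standing (False) from a
    per-second step count, using a trailing moving average plus the rule
    that a walking period must last at least `min_run` seconds."""
    avg = [sum(step_counts[max(0, i - window + 1):i + 1]) / min(window, i + 1)
           for i in range(len(step_counts))]
    labels = [a > threshold for a in avg]
    # Suppress walking bouts shorter than min_run seconds.
    i = 0
    while i < len(labels):
        if labels[i]:
            j = i
            while j < len(labels) and labels[j]:
                j += 1
            if j - i < min_run:
                labels[i:j] = [False] * (j - i)
            i = j
        else:
            i += 1
    return labels
```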

  6. Continuous processing of recombinant proteins: Integration of inclusion body solubilization and refolding using simulated moving bed size exclusion chromatography with buffer recycling.

    Science.gov (United States)

    Wellhoefer, Martin; Sprinzl, Wolfgang; Hahn, Rainer; Jungbauer, Alois

    2013-12-06

    An integrated process which combines continuous inclusion body dissolution with NaOH and continuous matrix-assisted refolding based on closed-loop simulated moving bed size exclusion chromatography was designed and experimentally evaluated at laboratory scale. Inclusion bodies from N(pro) fusion pep6His and N(pro) fusion MCP1 from high cell density fermentation were continuously dissolved with NaOH, filtered and mixed with concentrated refolding buffer prior to refolding by size exclusion chromatography (SEC). This process enabled an isocratic operation of the simulated moving bed (SMB) system with a closed-loop set-up with refolding buffer as the desorbent buffer and buffer recycling by concentrating the raffinate using tangential flow filtration. With this continuous refolding process, we increased the refolding and cleavage yield of both model proteins by 10% compared to batch dilution refolding. Furthermore, more than 99% of the refolding buffer of the raffinate could be recycled which reduced the buffer consumption significantly. Based on the actual refolding data, we compared throughput, productivity, and buffer consumption between two batch dilution refolding processes - one using urea for IB dissolution, the other one using NaOH for IB dissolution - and our continuous refolding process. The higher complexity of the continuous refolding process was rewarded with higher throughput and productivity as well as significantly lower buffer consumption compared to the batch dilution refolding processes. Copyright © 2013 Elsevier B.V. All rights reserved.

  7. Integration process and logistics results

    International Nuclear Information System (INIS)

    2004-01-01

    The Procurement and Logistics functions have gone through a process of integration from the beginning of the integrated management of Asco and Vandellos II up to the present. These are functions that lend themselves to delivering a single product to the rest of the organization, defined from a high level of expectations, and that admit simplifications and the realization of synergies as they are approached from an integrated perspective. The functions analyzed are as follows: Service and Material Purchasing, Warehouse and Material Management, and Documentation and General Services Management. In all cases, to accomplish the integration, objectives, procedures and information systems were unified. As for the organization, a decision was made in each case on whether or not to outsource. The decisive corporate strategy to integrate, resulting in actions such as moving corporate headquarters to Vandellos II, corporate consolidation, regulation of employment and implementation of the ENDESA Group Economic Information System (SIE), has shaped this process, which at present can be considered practically complete. (Author)

  8. Improving Construction Process through Integration and Concurrent Engineering

    Directory of Open Access Journals (Sweden)

    Malik Khalfan

    2012-11-01

    Full Text Available In an increasingly competitive business environment, improved time-to-market, reduced production cost, quality of the product and customer involvement are rapidly becoming the key success factors for any product development process. Consequently, most organisations are moving towards the adoption of the latest technology and new management concepts and philosophies such as total quality management and concurrent engineering (CE) to bring improvement to their product development process. This paper discusses the adoption of integrated processes and CE within the construction industry to enable construction organisations to improve their project development process. It also discusses a proposed integrated database model for construction projects, which should enable the construction process to improve, become more effective and more efficient.

  9. The application of moving average control charts for evaluating magnetic field quality on an individual magnet basis

    International Nuclear Information System (INIS)

    Pollock, D.A.; Gunst, R.F.; Schucany, W.R.

    1994-01-01

    SSC Collider Dipole Magnet field quality specifications define limits of variation for the population mean (Systematic) and standard deviation (RMS deviation) of allowed and unallowed multipole coefficients generated by the full collection of dipole magnets throughout the Collider operating cycle. A fundamental Quality Control issue is how to determine the acceptability of individual magnets during production, in other words taken one at a time and compared to the population parameters. Provided that the normal distribution assumptions hold, the random variation of multipoles for individual magnets may be evaluated by comparing the measured results to ± 3 x RMS tolerance, centered on the design nominal. To evaluate the local and cumulative systematic variation of the magnets against the distribution tolerance, individual magnet results need to be combined with others that come before it. This paper demonstrates a Statistical Quality Control method (the Unweighted Moving Average control chart) to evaluate individual magnet performance and process stability against population tolerances. The DESY/HERA Dipole cold skew quadrupole measurements for magnets in production order are used to evaluate non-stationarity of the mean over time for the cumulative set of magnets, as well as for a moving sample
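
    An unweighted moving average control chart of the kind described can be sketched as follows: each charted point is the mean of the last w magnet measurements, compared against limits of ±3σ/√w around the design target, with σ standing for the population RMS tolerance. Window length and tolerances below are hypothetical:

```python
def moving_average_chart(values, target, sigma, window=5):
    """Unweighted moving average control chart: each point is the mean of
    the last w measurements, flagged when it falls outside the
    +/- 3*sigma/sqrt(w) control limits around the design target."""
    points, flags = [], []
    for i in range(len(values)):
        w = min(window, i + 1)
        m = sum(values[i - w + 1:i + 1]) / w
        points.append(m)
        flags.append(abs(m - target) > 3 * sigma / w ** 0.5)
    return points, flags
```

    Averaging over w magnets tightens the limits by √w, so the chart can flag a drift of the systematic (mean) multipole that no single magnet, judged against the ±3σ individual tolerance, would reveal.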

  10. A General Representation Theorem for Integrated Vector Autoregressive Processes

    DEFF Research Database (Denmark)

    Franchi, Massimo

    We study the algebraic structure of an I(d) vector autoregressive process, where d is restricted to be an integer. This is useful to characterize its polynomial cointegrating relations and its moving average representation, that is to prove a version of the Granger representation theorem valid...

  11. Integral transforms of the quantum mechanical path integral: Hit function and path-averaged potential

    Science.gov (United States)

    Edwards, James P.; Gerber, Urs; Schubert, Christian; Trejo, Maria Anabel; Weber, Axel

    2018-04-01

    We introduce two integral transforms of the quantum mechanical transition kernel that represent physical information about the path integral. These transforms can be interpreted as probability distributions on particle trajectories measuring respectively the relative contribution to the path integral from paths crossing a given spatial point (the hit function) and the likelihood of values of the line integral of the potential along a path in the ensemble (the path-averaged potential).

  12. Shadowfax: Moving mesh hydrodynamical integration code

    Science.gov (United States)

    Vandenbroucke, Bert

    2016-05-01

    Shadowfax simulates galaxy evolution. Written in object-oriented modular C++, it evolves a mixture of gas, subject to the laws of hydrodynamics and gravity, and any collisionless fluid only subject to gravity, such as cold dark matter or stars. For the hydrodynamical integration, it makes use of a (co-) moving Lagrangian mesh. The code has a 2D and 3D version, contains utility programs to generate initial conditions and visualize simulation snapshots, and its input/output is compatible with a number of other simulation codes, e.g. Gadget2 (ascl:0003.001) and GIZMO (ascl:1410.003).

  13. Integration of air quality-related planning processes : report

    International Nuclear Information System (INIS)

    2004-05-01

    Several communities in British Columbia have conducted air quality, greenhouse gas, or community energy management plans. This report explored the possibility of integrating 3 community-based air quality-related planning processes into a single process and evaluated the use of these 3 processes by local governments and First Nations in identifying and addressing air quality-related objectives, and determined to what extent they could be integrated to achieve planning objectives for air quality, greenhouse gas emissions, and energy supply and conservation. The lessons learned from 9 case studies in British Columbia were presented. The purpose of the case studies was to examine how communities handled emissions and energy related inventory and planning work, as well as their experiences with, or considerations for, an integrated process. The lessons were grouped under several key themes including organization and stakeholder involvement; messaging and focus; leadership/champions; and resources and capacity. The report also outlined a framework for an integrated planning process and provided recommendations regarding how an integrated or complementary process could be performed. A number of next steps were also offered for the provincial government to move the concept of an integrated process forward with the assistance of other partners. These included identifying the resources required to support communities engaging in an integrated process as well as discussing the series of options for provincial support with key stakeholders. refs., tabs., figs

  14. 47 CFR 64.1801 - Geographic rate averaging and rate integration.

    Science.gov (United States)

    2010-10-01

    47 CFR 64.1801 (2010-10-01) - Geographic rate averaging and rate integration. Section 64.1801, Telecommunication, FEDERAL COMMUNICATIONS COMMISSION (CONTINUED), COMMON CARRIER SERVICES (CONTINUED), MISCELLANEOUS RULES RELATING TO COMMON CARRIERS, Geographic Rate Averaging and...

  15. Time Series ARIMA Models of Undergraduate Grade Point Average.

    Science.gov (United States)

    Rogers, Bruce G.

    Auto-Regressive Integrated Moving Average (ARIMA) models, often referred to as Box-Jenkins models, are regression methods for analyzing sequentially dependent observations with large amounts of data. The Box-Jenkins approach, a three-stage procedure consisting of identification, estimation and diagnosis, was used to select the most appropriate…

  16. A Two-Factor Autoregressive Moving Average Model Based on Fuzzy Fluctuation Logical Relationships

    Directory of Open Access Journals (Sweden)

    Shuang Guan

    2017-10-01

    Full Text Available Many of the existing autoregressive moving average (ARMA) forecast models are based on one main factor. In this paper, we proposed a new two-factor first-order ARMA forecast model based on fuzzy fluctuation logical relationships of both a main factor and a secondary factor of a historical training time series. Firstly, we generated a fluctuation time series (FTS) for the two factors by calculating the difference of each data point with its previous day, then finding the absolute means of the two FTSs. We then constructed a fuzzy fluctuation time series (FFTS) according to the defined linguistic sets. The next step was establishing fuzzy fluctuation logical relation groups (FFLRGs) for a two-factor first-order autoregressive (AR(1)) model and forecasting the training data with the AR(1) model. Then we built FFLRGs for a two-factor first-order autoregressive moving average (ARMA(1,m)) model. Lastly, we forecasted test data with the ARMA(1,m) model. To illustrate the performance of our model, we used the real Taiwan Stock Exchange Capitalization Weighted Stock Index (TAIEX) and Dow Jones datasets, with the Dow Jones as a secondary factor to forecast the TAIEX. The experiment results indicate that the proposed two-factor fluctuation ARMA method outperformed the one-factor method based on real historic data. The secondary factor may have some effects on the main factor and thereby impact the forecasting results. Using fuzzified fluctuations rather than fuzzified real data could avoid the influence of extreme values in historic data, which affect forecasting negatively. To verify the accuracy and effectiveness of the model, we also employed our method to forecast the Shanghai Stock Exchange Composite Index (SHSECI) from 2001 to 2015 and the international gold price from 2000 to 2010.
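
    The first steps of the method, building the fluctuation time series and fuzzifying it against the mean absolute fluctuation, can be sketched as follows. The three linguistic labels and the ±g/2 cut points are a common convention in this literature, assumed here for illustration rather than taken from the paper:

```python
def fuzzify_fluctuations(series):
    """Build the fluctuation time series (FTS) of day-over-day changes and
    fuzzify it into linguistic labels relative to the mean absolute
    fluctuation g, with cut points at +/- g/2 (assumed convention)."""
    fts = [series[i] - series[i - 1] for i in range(1, len(series))]
    g = sum(abs(d) for d in fts) / len(fts)
    labels = []
    for d in fts:
        if d <= -g / 2:
            labels.append('down')
        elif d < g / 2:
            labels.append('flat')
        else:
            labels.append('up')
    return labels
```

    Working with fluctuation labels rather than raw prices is what shields the logical relation groups from extreme historical values.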

  17. Weighted estimates for the averaging integral operator

    Czech Academy of Sciences Publication Activity Database

    Opic, Bohumír; Rákosník, Jiří

    2010-01-01

    Roč. 61, č. 3 (2010), s. 253-262 ISSN 0010-0757 R&D Projects: GA ČR GA201/05/2033; GA ČR GA201/08/0383 Institutional research plan: CEZ:AV0Z10190503 Keywords : averaging integral operator * weighted Lebesgue spaces * weights Subject RIV: BA - General Mathematics Impact factor: 0.474, year: 2010 http://link.springer.com/article/10.1007%2FBF03191231

  18. Continuous processing of recombinant proteins: integration of refolding and purification using simulated moving bed size-exclusion chromatography with buffer recycling.

    Science.gov (United States)

    Wellhoefer, Martin; Sprinzl, Wolfgang; Hahn, Rainer; Jungbauer, Alois

    2014-04-11

    Continuous processing of recombinant proteins was accomplished by combining continuous matrix-assisted refolding and purification by tandem simulated moving bed (SMB) size-exclusion chromatography (SEC). Recombinant proteins, N(pro) fusion proteins from inclusion bodies were dissolved with NaOH and refolded in the SMB system with a closed-loop set-up with refolding buffer as the desorbent buffer and buffer recycling of the refolding buffer of the raffinate by tangential flow filtration. For further purification of the refolded proteins, a second SMB operation also based on SEC was added. The whole system could be operated isocratically with refolding buffer as the desorbent buffer, and buffer recycling could also be applied in the purification step. Thus, a significant reduction in buffer consumption was achieved. The system was evaluated with two proteins, the N(pro) fusion pep6His and N(pro) fusion MCP-1. Refolding solution, which contained residual N(pro) fusion peptide, the cleaved autoprotease N(pro), and the cleaved target peptide was used as feed solution. Full separation of the cleaved target peptide from residual proteins was achieved at a purity and recovery in the raffinate and extract, respectively, of approximately 100%. In addition, more than 99% of the refolding buffer of the raffinate was recycled. A comparison of throughput, productivity, and buffer consumption of the integrated continuous process with two batch processes demonstrated that up to 60-fold higher throughput, up to 180-fold higher productivity, and at least 28-fold lower buffer consumption can be obtained by the integrated continuous process, which compensates for the higher complexity. Copyright © 2014 Elsevier B.V. All rights reserved.

  19. The Moving Average Convergence-Divergence as a Tool for Investment Decisions in the Stock Market

    Directory of Open Access Journals (Sweden)

    Rodrigo Silva Vidotto

    2009-04-01

    Full Text Available The increase in the number of investors at Bovespa since 2000 is due to stabilized inflation and falling interest rates. The use of tools that assist investors in selling and buying stocks is very important in a competitive and risky market. Technical analysis of stocks is used to search for trends in the movements of share prices and thereby indicate a suitable moment to buy or sell stocks. Among these technical indicators is the Moving Average Convergence-Divergence [MACD], which uses the concept of the moving average in its equation and is considered by financial analysts a simple tool to operate and analyze. This article aims to assess the effectiveness of the use of the MACD to indicate the moment to purchase and sell stocks in five companies, selected at random from the ninety companies in the Bovespa New Market, and to analyze the profitability gained during 2006, taking the appreciation of the Ibovespa index in that year as a reference. The results show that the cumulative average return of the five companies was 26.7%, against a cumulative average return of 0.90% for the Ibovespa.
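
    The MACD itself is simple to compute: the difference between a fast and a slow exponential moving average (EMA) of the closing price, with a signal line that is an EMA of that difference. A sketch with the conventional 12/26/9 spans (the article's exact parameterization is not stated in the abstract):

```python
def ema(values, span):
    """Exponential moving average with smoothing factor 2 / (span + 1)."""
    alpha = 2 / (span + 1)
    out = [values[0]]
    for v in values[1:]:
        out.append(alpha * v + (1 - alpha) * out[-1])
    return out

def macd(prices, fast=12, slow=26, signal=9):
    """MACD line (fast EMA minus slow EMA), signal line (EMA of the MACD
    line) and histogram, with the conventional 12/26/9 spans."""
    line = [f - s for f, s in zip(ema(prices, fast), ema(prices, slow))]
    sig = ema(line, signal)
    hist = [l - s for l, s in zip(line, sig)]
    return line, sig, hist
```

    A buy signal is typically read when the MACD line crosses above the signal line, and a sell signal when it crosses below.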

  20. Statistical aspects of autoregressive-moving average models in the assessment of radon mitigation

    International Nuclear Information System (INIS)

    Dunn, J.E.; Henschel, D.B.

    1989-01-01

    Radon values, as reflected by hourly scintillation counts, seem dominated by major, pseudo-periodic, random fluctuations. This methodological paper reports a moderate degree of success in modeling these data using relatively simple autoregressive-moving average models to assess the effectiveness of radon mitigation techniques in existing housing. While accounting for the natural correlation of successive observations, familiar summary statistics such as steady-state estimates, standard errors, confidence limits, and hypothesis tests are produced. The Box-Jenkins approach is used throughout. In particular, intervention analysis provides an objective means of assessing the effectiveness of an active mitigation measure, such as a fan off/on cycle. Occasionally, failure to declare a significant intervention has suggested a means of remedial action in the data collection procedure.

  1. Neural networks prediction and fault diagnosis applied to stationary and non stationary ARMA (Autoregressive moving average) modeled time series

    International Nuclear Information System (INIS)

    Marseguerra, M.; Minoggio, S.; Rossi, A.; Zio, E.

    1992-01-01

    The correlated noise affecting many industrial plants under stationary or cyclo-stationary conditions - nuclear reactors included - has been successfully modeled by autoregressive moving average (ARMA) models, owing to the versatility of this technique. The relatively recent neural network methods have similar features, and much effort is being devoted to exploring their usefulness in forecasting and control. Identifying a signal by means of an ARMA model gives rise to the problem of selecting its correct order. Similar difficulties must be faced when applying neural network methods; specifically, particular care must be given to setting up the appropriate network topology, the data normalization procedure and the learning code. In the present paper the capability of some neural networks to learn ARMA and seasonal ARMA processes is investigated. The results of the tested cases look promising, since they indicate that the neural networks learn the underlying process with relative ease, so that their forecasting capability may represent a convenient fault diagnosis tool. (Author)
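
    As an illustration of the kind of ARMA process such networks are trained on, a minimal simulator (a generic sketch, not the authors' code):

```python
import random

def simulate_arma(phi, theta, n, sigma=1.0, seed=0):
    """Simulate x_t = sum_i phi[i]*x_{t-1-i} + e_t + sum_j theta[j]*e_{t-1-j},
    with e_t ~ N(0, sigma^2) Gaussian white noise."""
    rng = random.Random(seed)
    x, e = [], []
    for t in range(n):
        et = rng.gauss(0.0, sigma)
        xt = et
        xt += sum(phi[i] * x[t - 1 - i] for i in range(len(phi)) if t - 1 - i >= 0)
        xt += sum(theta[j] * e[t - 1 - j] for j in range(len(theta)) if t - 1 - j >= 0)
        e.append(et)
        x.append(xt)
    return x
```

    Setting `phi` and `theta` to empty lists recovers pure white noise; seasonal ARMA terms would add lags at multiples of the season length.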

  2. Electricity demand loads modeling using AutoRegressive Moving Average (ARMA) models

    Energy Technology Data Exchange (ETDEWEB)

    Pappas, S.S. [Department of Information and Communication Systems Engineering, University of the Aegean, Karlovassi, 83 200 Samos (Greece); Ekonomou, L.; Chatzarakis, G.E. [Department of Electrical Engineering Educators, ASPETE - School of Pedagogical and Technological Education, N. Heraklion, 141 21 Athens (Greece); Karamousantas, D.C. [Technological Educational Institute of Kalamata, Antikalamos, 24100 Kalamata (Greece); Katsikas, S.K. [Department of Technology Education and Digital Systems, University of Piraeus, 150 Androutsou Srt., 18 532 Piraeus (Greece); Liatsis, P. [Division of Electrical Electronic and Information Engineering, School of Engineering and Mathematical Sciences, Information and Biomedical Engineering Centre, City University, Northampton Square, London EC1V 0HB (United Kingdom)

    2008-09-15

    This study addresses the problem of modeling the electricity demand loads in Greece. The provided actual load data are deseasonalized and an AutoRegressive Moving Average (ARMA) model is fitted to the data off-line, using the Akaike Corrected Information Criterion (AICC). The developed model fits the data successfully. Difficulties occur when the provided data include noise or errors, and also when on-line/adaptive modeling is required. In both cases, and under the assumption that the provided data can be represented by an ARMA model, simultaneous order and parameter estimation of ARMA models under the presence of noise is performed. The produced results indicate that the proposed method, which is based on multi-model partitioning theory, successfully tackles the studied problem. For validation purposes the produced results are compared with three other established order selection criteria, namely AICC, Akaike's Information Criterion (AIC) and Schwarz's Bayesian Information Criterion (BIC). The developed model could be useful in studies concerning electricity consumption and electricity price forecasts. (author)
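
    The order-selection criteria compared in the study differ only in their complexity penalty. Their standard definitions, with `loglik` the maximized log-likelihood of a fitted model, `k` the number of estimated parameters, and `n` the sample size:

```python
import math

def aic(loglik, k):
    """Akaike's Information Criterion."""
    return 2 * k - 2 * loglik

def aicc(loglik, k, n):
    """Corrected AIC: adds a small-sample penalty (requires n > k + 1)."""
    return aic(loglik, k) + 2 * k * (k + 1) / (n - k - 1)

def bic(loglik, k, n):
    """Schwarz's Bayesian Information Criterion."""
    return k * math.log(n) - 2 * loglik
```

    The candidate order minimizing the chosen criterion is selected; AICC converges to AIC as n grows but penalizes extra parameters more heavily in small samples.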

  3. Ergodic averages via dominating processes

    DEFF Research Database (Denmark)

    Møller, Jesper; Mengersen, Kerrie

    2006-01-01

    We show how the mean of a monotone function (defined on a state space equipped with a partial ordering) can be estimated, using ergodic averages calculated from upper and lower dominating processes of a stationary irreducible Markov chain. In particular, we do not need to simulate the stationary Markov chain, and we eliminate the problem of whether an appropriate burn-in has been determined. Moreover, when a central limit theorem applies, we show how confidence intervals for the mean can be estimated by bounding the asymptotic variance of the ergodic average based on the equilibrium chain.

  4. Limit theorems for stationary increments Lévy driven moving averages

    DEFF Research Database (Denmark)

    Basse-O'Connor, Andreas; Lachièze-Rey, Raphaël; Podolskij, Mark

    of the kernel function g at 0. First order asymptotic theory essentially comprises three cases: stable convergence towards a certain infinitely divisible distribution, an ergodic type limit theorem, and convergence in probability towards an integrated random process. We also prove the second order limit theorem...

  5. Spark formation as a moving boundary process

    Science.gov (United States)

    Ebert, Ute

    2006-03-01

    The growth process of spark channels has recently become accessible through complementary methods. First, I will review experiments with nanosecond photographic resolution and with fast and well defined power supplies that appropriately resolve the dynamics of electric breakdown [1]. Second, I will discuss the elementary physical processes as well as present computations of spark growth and branching with adaptive grid refinement [2]. These computations resolve three well separated scales of the process that emerge dynamically. Third, this scale separation motivates a hierarchy of models on different length scales. In particular, I will discuss a moving boundary approximation for the ionization fronts that generate the conducting channel. The resulting moving boundary problem shows strong similarities with classical viscous fingering. For viscous fingering, it is known that the simplest model forms unphysical cusps within finite time that are suppressed by a regularizing condition on the moving boundary. For ionization fronts, we derive a new condition on the moving boundary of mixed Dirichlet-Neumann type (φ = ε ∂nφ) that indeed regularizes all structures investigated so far. In particular, we present compact analytical solutions with regularization, both for uniformly translating shapes and for their linear perturbations [3]. These solutions are so simple that they may acquire a paradigmatic role in the future. Within linear perturbation theory, they explicitly show the convective stabilization of a curved front, while planar fronts are linearly unstable against perturbations of arbitrary wave length. [1] T.M.P. Briels, E.M. van Veldhuizen, U. Ebert, TU Eindhoven. [2] C. Montijn, J. Wackers, W. Hundsdorfer, U. Ebert, CWI Amsterdam. [3] B. Meulenbroek, U. Ebert, L. Schäfer, Phys. Rev. Lett. 95, 195004 (2005).

  6. Total Ore Processing Integration and Management

    Energy Technology Data Exchange (ETDEWEB)

    Leslie Gertsch

    2006-05-15

    This report outlines the technical progress achieved for project DE-FC26-03NT41785 (Total Ore Processing Integration and Management) during the period 01 January through 31 March of 2006. (1) Work in Progress: Minntac Mine--Graphical analysis of drill monitor data moved from two-dimensional horizontal patterns to vertical variations in measured and calculated parameters. The rock quality index and the two dimensionless (π) indices developed by Kewen Yin of the University of Minnesota are used by Minntac Mine to design their blasts, but the drill monitor data from any given pattern is obviously not available for the design of that shot. Therefore, the blast results--which are difficult to quantify in a short time--must be back-analyzed for comparison with the drill monitor data to be useful for subsequent blast designs. π₁ indicates the performance of the drill, while π₂ is a measure of the rock resistance to drilling. As would be expected, since a drill tends to perform better in rock that offers little resistance, π₁ and π₂ are strongly inversely correlated; the relationship is a power function rather than simply linear. Low values of each π index tend to be quantized, indicating that these two parameters may be most useful above certain minimum magnitudes. (2) Work in Progress: Hibtac Mine--Statistical examination of a data set from Hibtac Mine (Table 1) shows that incorporating information on the size distribution of material feeding from the crusher to the autogenous mills improves the predictive capability of the model somewhat (43% vs. 44% correlation coefficient), but a more important component is production data from preceding days (26% vs. 44% correlation coefficient), determined using exponentially weighted moving average predictive variables. This lag effect likely reflects the long and varied residence times of the different size fragments in the grinding mills. The rock sizes are also correlated with the geologic
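
    The exponentially weighted moving average used above for the predictive variables can be sketched as follows (the smoothing weight `lam` is an assumption; the report does not state the value used at Hibtac):

```python
def ewma(values, lam=0.3):
    """Exponentially weighted moving average: each new observation gets
    weight lam and the running average keeps weight (1 - lam), so older
    data decay geometrically -- which captures the lag effect described."""
    avg = values[0]
    out = [avg]
    for v in values[1:]:
        avg = lam * v + (1 - lam) * avg
        out.append(avg)
    return out
```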

  7. Shape and depth determinations from second moving average residual self-potential anomalies

    International Nuclear Information System (INIS)

    Abdelrahman, E M; El-Araby, T M; Essa, K S

    2009-01-01

    We have developed a semi-automatic method to determine the depth and shape (shape factor) of a buried structure from second moving average residual self-potential anomalies obtained from observed data using filters of successive window lengths. The method uses a relationship between the depth and shape of the source and a combination of windowed observations. The relationship represents a parametric family of curves (window curves). For a fixed window length, the depth is determined for each shape factor. The computed depths are plotted against the shape factors, representing a continuous monotonically increasing curve. The solution for the shape and depth is read at the common intersection of the window curves. The validity of the method is tested on a synthetic example with and without random errors and on two field examples from Turkey and Germany. In all cases examined, the depth and shape solutions obtained are in very good agreement with the true ones.

  8. Moving Average Filter-Based Phase-Locked Loops: Performance Analysis and Design Guidelines

    DEFF Research Database (Denmark)

    Golestan, Saeed; Ramezani, Malek; Guerrero, Josep M.

    2014-01-01

    this challenge, incorporating moving average filter(s) (MAF) into the PLL structure has been proposed in some recent literature. A MAF is a linear-phase finite impulse response filter which can act as an ideal low-pass filter if certain conditions hold. The main aim of this paper is to present the control design guidelines for a typical MAF-based PLL. The paper starts with a general description of MAFs. The main challenge associated with using MAFs is then explained, and its possible solutions are discussed. The paper then proceeds with a brief overview of the different MAF-based PLLs. In each case, the PLL block diagram description is shown, the advantages and limitations are briefly discussed, and the tuning approach (if available) is evaluated. The paper then presents two systematic methods to design the control parameters of a typical MAF-based PLL: one for the case of using a proportional...
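
    A MAF of window length N is the FIR filter y[n] = (1/N) * sum_{k=0..N-1} x[n-k]; it completely rejects any periodic disturbance whose period divides the window length, which is the sense in which it can act as an ideal low-pass filter. A minimal sketch:

```python
def maf(x, N):
    """Moving average filter: y[n] = (1/N) * sum of the last N samples,
    treating x as zero before the first sample (start-up transient)."""
    y = []
    for n in range(len(x)):
        window = x[max(0, n - N + 1): n + 1]
        y.append(sum(window) / N)
    return y
```

    After N - 1 samples the filter settles; a disturbance with period exactly N (e.g. a grid-frequency ripple sampled N times per cycle) is nulled once the window spans a full period.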

  9. A robust combination approach for short-term wind speed forecasting and analysis – Combination of the ARIMA (Autoregressive Integrated Moving Average), ELM (Extreme Learning Machine), SVM (Support Vector Machine) and LSSVM (Least Square SVM) forecasts using a GPR (Gaussian Process Regression) model

    International Nuclear Information System (INIS)

    Wang, Jianzhou; Hu, Jianming

    2015-01-01

    With the increasing importance of wind power as a component of power systems, the problems induced by the stochastic and intermittent nature of wind speed have compelled system operators and researchers to search for more reliable techniques to forecast wind speed. This paper proposes a combination model for probabilistic short-term wind speed forecasting. In the proposed hybrid approach, EWT (Empirical Wavelet Transform) is employed to extract meaningful information from a wind speed series by designing an appropriate wavelet filter bank. The GPR (Gaussian Process Regression) model is utilized to combine independent forecasts generated by various forecasting engines (ARIMA (Autoregressive Integrated Moving Average), ELM (Extreme Learning Machine), SVM (Support Vector Machine) and LSSVM (Least Square SVM)) in a nonlinear way rather than the commonly used linear way. The proposed approach provides more probabilistic information for wind speed predictions besides improving the forecasting accuracy for single-value predictions. The effectiveness of the proposed approach is demonstrated with wind speed data from two wind farms in China. The results indicate that the individual forecasting engines do not consistently forecast short-term wind speed for the two sites, and that the proposed combination method can generate a more reliable and accurate forecast. Highlights: • The proposed approach enables probabilistic modeling of wind speed series. • The proposed approach adapts to the time-varying characteristic of the wind speed. • The hybrid approach can extract the meaningful components from the wind speed series. • The proposed method generates adaptive, reliable and more accurate forecasting results. • The proposed model combines four independent forecasting engines in a nonlinear way.

  10. Flue gas cleanup using the Moving-Bed Copper Oxide Process

    Energy Technology Data Exchange (ETDEWEB)

    Pennline, Henry W; Hoffman, James S

    2013-10-01

    The use of copper oxide on a support had been envisioned as a gas cleanup technique to remove sulfur dioxide (SO₂) and nitric oxides (NOₓ) from flue gas produced by the combustion of coal for electric power generation. In general, dry, regenerable flue gas cleanup techniques that use a sorbent can have various advantages, such as simultaneous removal of pollutants, production of a salable by-product, and low costs when compared to commercially available wet scrubbing technology. Due to the temperature of reaction, the placement of the process into an advanced power system could actually increase the thermal efficiency of the plant. The Moving-Bed Copper Oxide Process is capable of simultaneously removing sulfur oxides and nitric oxides within the reactor system. In this regenerable sorbent technique, the copper oxide sorbent was originally used in a fluidized bed, but the more recent effort developed the use of the sorbent in a moving-bed reactor design. A pilot facility, or life-cycle test system, was constructed so that integrated testing of the sorbent over absorption/regeneration cycles could be conducted. A parametric study of the total process was then performed in which all process steps, including absorption and regeneration, were continuously operated and experimentally evaluated. The parametric effects, including absorption temperature, sorbent and gas residence times, inlet SO₂ and NOₓ concentrations, and flyash loadings, on removal efficiencies and overall operational performance were determined. Although some of the research results have not been previously published because of previous collaborative restrictions, a summary of these past findings is presented in this communication. Additionally, the potential use of the process for criteria pollutant removal in oxy-firing of fossil fuel for carbon sequestration purposes is discussed.

  11. Application of a Combined Model with Autoregressive Integrated Moving Average (ARIMA) and Generalized Regression Neural Network (GRNN) in Forecasting Hepatitis Incidence in Heng County, China.

    Directory of Open Access Journals (Sweden)

    Wudi Wei

    Hepatitis is a serious public health problem with increasing cases and property damage in Heng County. It is necessary to develop a model to predict the hepatitis epidemic that could be useful for preventing this disease. The autoregressive integrated moving average (ARIMA) model and the generalized regression neural network (GRNN) model were used to fit the incidence data from the Heng County CDC (Center for Disease Control and Prevention) from January 2005 to December 2012. Then, the ARIMA-GRNN hybrid model was developed. The incidence data from January 2013 to December 2013 were used to validate the models. Several parameters, including mean absolute error (MAE), root mean square error (RMSE), mean absolute percentage error (MAPE) and mean square error (MSE), were used to compare performance among the three models. The morbidity of hepatitis from Jan 2005 to Dec 2012 showed seasonal variation and a slightly rising trend. The ARIMA(0,1,2)(1,1,1)12 model was the most appropriate one, with the residual test showing a white noise sequence. The smoothing factor of the basic GRNN model and the combined model was 1.8 and 0.07, respectively. The four parameters of the hybrid model were lower than those of the two single models in the validation. The parameter values of the GRNN model were the lowest in the fitting of the three models. The hybrid ARIMA-GRNN model showed better hepatitis incidence forecasting in Heng County than the single ARIMA model and the basic GRNN model. It is a potential decision-supportive tool for controlling hepatitis in Heng County.
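
    The four comparison metrics used in the validation have standard definitions; a minimal sketch:

```python
def forecast_errors(actual, predicted):
    """MAE, RMSE, MAPE (in percent), and MSE between two equal-length series.
    MAPE requires all actual values to be nonzero."""
    n = len(actual)
    diffs = [a - p for a, p in zip(actual, predicted)]
    mae = sum(abs(d) for d in diffs) / n
    mse = sum(d * d for d in diffs) / n
    rmse = mse ** 0.5
    mape = 100.0 * sum(abs(d / a) for d, a in zip(diffs, actual)) / n
    return {"MAE": mae, "RMSE": rmse, "MAPE": mape, "MSE": mse}
```

    Lower values on all four metrics, as reported for the hybrid model, indicate better out-of-sample accuracy.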

  12. Average equilibrium charge state of 278113 ions moving in a helium gas

    International Nuclear Information System (INIS)

    Kaji, D.; Morita, K.; Morimoto, K.

    2005-01-01

    Difficulty in identifying a new heavy element comes from the small production cross section. For example, the production cross section was about 0.5 pb in the case of searching for the 112th element produced by the cold fusion reaction 208Pb(70Zn,n)277112. In order to identify elements heavier than element 112, experimental apparatus with a sensitivity at the sub-picobarn level is essentially needed. A gas-filled recoil separator, in general, has a large collection efficiency compared with other recoil separators, as seen from its operation principle. One of the most important parameters for a gas-filled recoil separator is the average equilibrium charge state q_ave of ions moving in the used gas. This is because the recoil ion cannot be properly transported to the focal plane of the separator if the q_ave of the element of interest in the gas is unknown. We have systematically measured equilibrium charge state distributions of heavy ions (169Tm, 208Pb, 193,209Bi, 196Po, 200At, 203,204Fr, 212Ac, 234Bk, 245Fm, 254No, 255Lr, and 265Hs) moving in a helium gas by using the gas-filled recoil separator GARIS at RIKEN. An empirical formula for the q_ave of heavy ions in a helium gas was then derived as a function of the velocity and the atomic number of the ion, on the basis of the Thomas-Fermi model of the atom. The formula was found to be applicable to searches for the transactinide nuclides 271Ds, 272Rg, and 277112 produced by cold fusion reactions. Using the formula for q_ave, we searched for a new isotope of element 113 produced by the cold fusion reaction 209Bi(70Zn,n)278113. As a result, a decay chain due to an evaporation residue of 278113 was observed. Recently, we have successfully observed the 2nd decay chain due to an evaporation residue of 278113. In this report, we will present the experimental results in detail, and will also discuss the average equilibrium charge state of 278113 in a helium gas by
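
    The abstract does not reproduce the empirical formula itself; Thomas-Fermi-based charge-state fits of this kind typically take a Bohr-like form (a sketch of the usual scaling, not the authors' actual fit):

```latex
\bar{q} \,=\, Z\left[1-\exp\!\left(-\frac{v}{Z^{2/3}\,v_{0}}\right)\right],
\qquad v_{0}=\frac{e^{2}}{\hbar}\approx 2.19\times 10^{6}\ \mathrm{m\,s^{-1}},
```

    where v is the ion velocity, Z the atomic number, and v₀ the Bohr velocity; the authors' fitted coefficients would replace the bare Thomas-Fermi constants.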

  13. Assessing the Value of Moving More-The Integral Role of Qualified Health Professionals.

    Science.gov (United States)

    Arena, Ross; McNeil, Amy; Lavie, Carl J; Ozemek, Cemal; Forman, Daniel; Myers, Jonathan; Laddu, Deepika R; Popovic, Dejana; Rouleau, Codie R; Campbell, Tavis S; Hills, Andrew P

    2018-04-01

    Being physically active or, in a broader sense, simply moving more throughout each day is one of the most important components of an individual's health plan. In conjunction with regular exercise training, taking more steps in a day and sitting less are also important components of one's movement portfolio. Given this priority, health care professionals must develop enhanced skills for prescribing and guiding individualized movement programs for all their patients. An important component of a health care professional's ability to prescribe movement as medicine is competency in assessing an individual's risk for untoward events if physical exertion was increased. The ability to appropriately assess one's risk before advising an individual to move more is integral to clinical decision-making related to subsequent testing if needed, exercise prescription, and level of supervision with exercise training. At present, there is a lack of clarity pertaining to how a health care professional should go about assessing an individual's readiness to move more on a daily basis in a safe manner. Therefore, this perspectives article clarifies key issues related to prescribing movement as medicine and presents a new process for clinical assessment before prescribing an individualized movement program. Copyright © 2018 Elsevier B.V. All rights reserved.

  14. Robust nonlinear autoregressive moving average model parameter estimation using stochastic recurrent artificial neural networks

    DEFF Research Database (Denmark)

    Chon, K H; Hoyer, D; Armoundas, A A

    1999-01-01

    In this study, we introduce a new approach for estimating linear and nonlinear stochastic autoregressive moving average (ARMA) model parameters, given a corrupt signal, using artificial recurrent neural networks. This new approach is a two-step approach in which the parameters of the deterministic part of the stochastic ARMA model are first estimated via a three-layer artificial neural network (deterministic estimation step) and then reestimated using the prediction error as one of the inputs to the artificial neural networks in an iterative algorithm (stochastic estimation step). The prediction error is obtained by subtracting the corrupt signal of the estimated ARMA model obtained via the deterministic estimation step from the system output response. We present computer simulation examples to show the efficacy of the proposed stochastic recurrent neural network approach in obtaining accurate...

  15. Holistic Processing of Static and Moving Faces

    Science.gov (United States)

    Zhao, Mintao; Bülthoff, Isabelle

    2017-01-01

    Humans' face ability develops and matures with extensive experience in perceiving, recognizing, and interacting with faces that move most of the time. However, how facial movements affect one core aspect of face ability--holistic face processing--remains unclear. Here we investigated the influence of rigid facial motion on holistic and part-based…

  16. FPGA based computation of average neutron flux and e-folding period for start-up range of reactors

    International Nuclear Information System (INIS)

    Ram, Rajit; Borkar, S.P.; Dixit, M.Y.; Das, Debashis

    2013-01-01

    Pulse processing instrumentation channels used for reactor applications play a vital role in ensuring nuclear safety in the startup range of reactor operation and also during fuel loading and the first approach to criticality. These channels are intended for continuous run-time computation of the equivalent reactor core neutron flux and e-folding period. This paper focuses only on the computational part of these instrumentation channels, which is implemented in a single FPGA using a 32-bit floating point arithmetic engine. The computations of average count rate, log of average count rate, log rate and reactor period are done in VHDL using a digital circuit realization approach. The computation of average count rate uses a fully adaptive window size moving average method, while a Taylor series expansion for logarithms is implemented in the FPGA to compute the log of count rate, log rate and reactor e-folding period. This paper describes the block diagrams of the digital logic realization in the FPGA and the advantage of the fully adaptive window size moving average technique over the conventional fixed size moving average technique for pulse processing in reactor instrumentation. (author)
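
    In software terms, the channel's run-time arithmetic reduces to a windowed mean of the pulse counts plus a period derived from the slope of the log count rate. A simplified sketch (the FPGA uses an adaptive window size, fixed here for brevity):

```python
import math

def count_rate(counts, window):
    """Moving-average count rate over the last `window` sample intervals."""
    w = counts[-window:]
    return sum(w) / len(w)

def efolding_period(rate_now, rate_prev, dt):
    """Reactor e-folding period T from the log-rate slope:
    T = dt / ln(rate_now / rate_prev); positive T means rising flux."""
    return dt / math.log(rate_now / rate_prev)
```

    The FPGA implementation replaces `math.log` with a Taylor series expansion, as the abstract notes.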

  17. Synchronized moving aperture radiation therapy (SMART): average tumour trajectory for lung patients

    International Nuclear Information System (INIS)

    Neicu, Toni; Shirato, Hiroki; Seppenwoolde, Yvette; Jiang, Steve B

    2003-01-01

    Synchronized moving aperture radiation therapy (SMART) is a new technique for treating mobile tumours under development at Massachusetts General Hospital (MGH). The basic idea of SMART is to synchronize the moving radiation beam aperture formed by a dynamic multileaf collimator (DMLC) with the tumour motion induced by respiration. SMART is based on the concept of the average tumour trajectory (ATT) exhibited by a tumour during respiration. During the treatment simulation stage, tumour motion is measured and the ATT is derived. Then, the original IMRT MLC leaf sequence is modified using the ATT to compensate for tumour motion. During treatment, the tumour motion is monitored. The treatment starts when leaf motion and tumour motion are synchronized at a specific breathing phase. The treatment will halt when the tumour drifts away from the ATT and will resume when the synchronization between tumour motion and radiation beam is re-established. In this paper, we present a method to derive the ATT from measured tumour trajectory data. We also investigate the validity of the ATT concept for lung tumours during normal breathing. The lung tumour trajectory data were acquired during actual radiotherapy sessions using a real-time tumour-tracking system. SMART treatment is simulated by assuming that the radiation beam follows the derived ATT and the tumour follows the measured trajectory. In simulation, the treatment starts at exhale phase. The duty cycle of SMART delivery was calculated for various treatment times and gating thresholds, as well as for various exhale phases where the treatment begins. The simulation results show that in the case of free breathing, for 4 out of 11 lung datasets with tumour motion greater than 1 cm from peak to peak, the error in tumour tracking can be controlled to within a couple of millimetres while maintaining a reasonable delivery efficiency. That is to say, without any breath coaching/control, the ATT is a valid concept for some lung
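
    In its simplest reading, the ATT is obtained by averaging the measured trajectory over breathing cycles point-by-point on a common phase grid; a hypothetical sketch (the actual derivation used at MGH may differ):

```python
def average_trajectory(cycles):
    """Average several breathing cycles into one average tumour trajectory (ATT).
    Each cycle is assumed already resampled onto a common phase grid; if the
    cycles differ in length, the grid is truncated to the shortest one."""
    n = min(len(c) for c in cycles)
    return [sum(c[i] for c in cycles) / len(cycles) for i in range(n)]
```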

  18. Macrotransport processes: Brownian tracers as stochastic averagers in effective medium theories of heterogeneous media

    International Nuclear Information System (INIS)

    Brenner, H.

    1991-01-01

    Macrotransport processes (generalized Taylor dispersion phenomena) constitute coarse-grained descriptions of comparable convective diffusive-reactive microtransport processes, the latter supposed governed by microscale linear constitutive equations and boundary conditions, but characterized by spatially nonuniform phenomenological coefficients. Following a brief review of existing applications of the theory, the author focuses, by way of background information, upon the original (and now classical) Taylor-Aris dispersion problem, involving the combined convective and molecular diffusive transport of a point-size Brownian solute molecule (tracer) suspended in a Poiseuille solvent flow within a circular tube. A series of elementary generalizations of this prototype problem to chromatographic-like solute transport processes in tubes is used to illustrate some novel statistical-physical features. These examples emphasize the fact that a solute molecule may, on average, move axially down the tube at a different mean velocity (either larger or smaller) than that of a solvent molecule. Moreover, this solute molecule may suffer axial dispersion about its mean velocity at a rate greatly exceeding that attributable to its axial molecular diffusion alone. Such chromatographic anomalies represent novel macroscale non-linearities originating from physicochemical interactions between spatially inhomogeneous convective-diffusive-reactive microtransport processes.

  19. Average multiplications in deep inelastic processes and their interpretation

    International Nuclear Information System (INIS)

    Kiselev, A.V.; Petrov, V.A.

    1983-01-01

    Inclusive production of hadrons in deep inelastic processes is considered. It is shown that at high energies the jet evolution in deep inelastic processes is mainly of nonperturbative character. With the increase of the final hadron state energy, the leading contribution to the average multiplicity comes from a parton subprocess due to the production of massive quark and gluon jets and their further fragmentation, as the diquark contribution becomes less and less essential. The ratio of the total average multiplicity in deep inelastic processes to the average multiplicity in e⁺e⁻ annihilation at high energies tends to unity.

  20. The entanglement of two moving atoms interacting with a single-mode field via a three-photon process

    International Nuclear Information System (INIS)

    Chao, Wu; Mao-Fa, Fang

    2010-01-01

    In this paper, the entanglement of two moving atoms induced by a single-mode field via a three-photon process is investigated. It is shown that the entanglement depends on the category of the field, the average photon number N, the number p of half-wave lengths of the field mode, and the atomic initial state. Also, sudden death and sudden birth of the entanglement are observed in this model, and the results show that their existence depends on the parameters and the category of the mode field. In addition, the three-photon process is a higher order nonlinear process. (general)

  1. "Let's Move" campaign: applying the extended parallel process model.

    Science.gov (United States)

    Batchelder, Alicia; Matusitz, Jonathan

    2014-01-01

    This article examines Michelle Obama's health campaign, "Let's Move," through the lens of the extended parallel process model (EPPM). "Let's Move" aims to reduce the childhood obesity epidemic in the United States. Developed by Kim Witte, EPPM rests on the premise that people's attitudes can be changed when fear is exploited as a factor of persuasion. Fear appeals work best (a) when a person feels a concern about the issue or situation, and (b) when he or she believes to have the capability of dealing with that issue or situation. Overall, the analysis found that "Let's Move" is based on past health campaigns that have been successful. An important element of the campaign is the use of fear appeals (as it is postulated by EPPM). For example, part of the campaign's strategies is to explain the severity of the diseases associated with obesity. By looking at the steps of EPPM, readers can also understand the strengths and weaknesses of "Let's Move."

  2. Consensus in averager-copier-voter networks of moving dynamical agents

    Science.gov (United States)

    Shang, Yilun

    2017-02-01

    This paper deals with a hybrid opinion dynamics comprising averager, copier, and voter agents, which ramble as random walkers on a spatial network. Agents exchange information following some deterministic and stochastic protocols if they reside at the same site at the same time. Based on the stochastic stability of Markov chains, sufficient conditions guaranteeing consensus in the sense of almost sure convergence are obtained. The ultimate consensus state is identified in the form of an ergodicity result. Simulation studies are performed to validate the effectiveness and applicability of the theoretical results. The existence or non-existence of voters, and the proportion of them, are shown to play key roles during the consensus-reaching process.
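The averager update rule at the heart of such models can be sketched in a few lines. This is a hypothetical minimal illustration of synchronous opinion averaging among co-located agents, not the paper's full averager-copier-voter protocol with random-walk mobility; all names and values are ours:

```python
def averager_consensus(values, neighbors, rounds=50):
    """Repeatedly replace each averager's opinion with the mean of its
    own opinion and the opinions of the agents it currently meets.
    `neighbors[i]` lists the agents co-located with agent i."""
    x = list(values)
    for _ in range(rounds):
        new_x = []
        for i in range(len(x)):
            group = [x[i]] + [x[j] for j in neighbors[i]]
            new_x.append(sum(group) / len(group))
        x = new_x
    return x

# Fully connected meeting pattern: every agent averages with everyone.
vals = [0.0, 1.0, 4.0, 7.0]
nbrs = {i: [j for j in range(4) if j != i] for i in range(4)}
final = averager_consensus(vals, nbrs)
```

With a fully connected meeting pattern every agent sees all opinions, so consensus on the initial mean (here 3.0) is reached after a single round; sparser, time-varying meeting patterns, as in the paper's random-walk setting, converge more slowly.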

  3. An integrated approach for visual analysis of a multisource moving objects knowledge base

    NARCIS (Netherlands)

    Willems, N.; van Hage, W.R.; de Vries, G.; Janssens, J.H.M.; Malaisé, V.

    2010-01-01

    We present an integrated and multidisciplinary approach for analyzing the behavior of moving objects. The results originate from an ongoing research of four different partners from the Dutch Poseidon project (Embedded Systems Institute (2007)), which aims to develop new methods for Maritime Safety

  4. Tracking integration in concentrating photovoltaics using laterally moving optics.

    Science.gov (United States)

    Duerr, Fabian; Meuret, Youri; Thienpont, Hugo

    2011-05-09

    In this work the concept of tracking-integrated concentrating photovoltaics is studied and its capabilities are quantitatively analyzed. The design strategy desists from ideal concentration performance to reduce the external mechanical solar tracking effort in favor of a compact installation, possibly resulting in lower overall cost. The proposed optical design is based on an extended Simultaneous Multiple Surface (SMS) algorithm and uses two laterally moving plano-convex lenses to achieve high concentration over a wide angular range of ±24°. It achieves 500× concentration, outperforming its conventional concentrating photovoltaic counterparts on a polar aligned single axis tracker.

  5. Asymptotic behaviour of time averages for non-ergodic Gaussian processes

    Science.gov (United States)

    Ślęzak, Jakub

    2017-08-01

    In this work, we study the behaviour of time-averages for stationary (non-ageing), but ergodicity-breaking Gaussian processes using their representation in Fourier space. We provide explicit formulae for various time-averaged quantities, such as mean square displacement, density, and analyse the behaviour of time-averaged characteristic function, which gives insight into rich memory structure of the studied processes. Moreover, we show applications of the ergodic criteria in Fourier space, determining the ergodicity of the generalised Langevin equation's solutions.
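The time-averaged quantities studied here have simple discrete estimators; for instance, the time-averaged mean square displacement at a given lag is the average of squared increments along a single trajectory. A minimal sketch (the function name and the ballistic test trajectory are ours, not the paper's):

```python
def time_averaged_msd(x, lag):
    """Discrete time-averaged MSD at a given lag:
    the average of (x[t+lag] - x[t])**2 over all admissible t."""
    n = len(x) - lag
    return sum((x[t + lag] - x[t]) ** 2 for t in range(n)) / n

# Sanity check on ballistic motion x(t) = v*t, for which the
# time-averaged MSD is exactly (v*lag)**2 at every lag.
v = 0.5
traj = [v * t for t in range(100)]
msd10 = time_averaged_msd(traj, 10)  # (0.5*10)**2 = 25.0
```

For an ergodic process this time average converges to the ensemble MSD as the trajectory grows; for the ergodicity-breaking Gaussian processes of the paper the two can differ, which is precisely what the explicit formulae quantify.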

  6. Heterogeneous CPU-GPU moving targets detection for UAV video

    Science.gov (United States)

    Li, Maowen; Tang, Linbo; Han, Yuqi; Yu, Chunlei; Zhang, Chao; Fu, Huiquan

    2017-07-01

    Moving target detection is gaining popularity in civilian and military applications. On some motion-detection monitoring platforms, low-resolution stationary cameras are being replaced by moving HD cameras carried by UAVs. The pixels belonging to moving targets in HD video taken by a UAV are always in the minority, and the background of the frame is usually moving because of the motion of the UAV. The high computational cost of the detection algorithm prevents it from running at full frame resolution. Hence, to solve the problem of moving target detection in UAV video, we propose a heterogeneous CPU-GPU moving target detection algorithm. More specifically, we use background registration to eliminate the impact of the moving background and frame differencing to detect small moving targets. To achieve real-time processing, we design a heterogeneous CPU-GPU framework for our method. The experimental results show that our method can detect the main moving targets in HD video taken by a UAV, with an average processing time of 52.16 ms per frame, which is fast enough to solve the problem.
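Frame differencing, the small-target detector named above, reduces to thresholding per-pixel intensity changes between registered frames. A minimal CPU-only sketch with an invented 2x4 toy image (the actual pipeline first registers the moving background and runs on a heterogeneous CPU-GPU framework):

```python
def frame_difference_mask(prev, curr, threshold):
    """Mark pixels whose absolute intensity change between two
    registered frames exceeds `threshold`."""
    return [[1 if abs(c - p) > threshold else 0
             for p, c in zip(prow, crow)]
            for prow, crow in zip(prev, curr)]

# A single bright object moves one pixel right on a static background:
prev = [[0, 9, 0, 0],
        [0, 0, 0, 0]]
curr = [[0, 0, 9, 0],
        [0, 0, 0, 0]]
mask = frame_difference_mask(prev, curr, threshold=5)
# mask flags both the vacated and the newly occupied pixel:
# [[0, 1, 1, 0], [0, 0, 0, 0]]
```

In practice the difference image is then cleaned with morphological filtering and connected-component grouping to turn flagged pixels into target detections.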

  7. A fiber orientation-adapted integration scheme for computing the hyperelastic Tucker average for short fiber reinforced composites

    Science.gov (United States)

    Goldberg, Niels; Ospald, Felix; Schneider, Matti

    2017-10-01

    In this article we introduce a fiber orientation-adapted integration scheme for Tucker's orientation averaging procedure applied to non-linear material laws, based on angular central Gaussian fiber orientation distributions. This method is stable w.r.t. fiber orientations degenerating into planar states and enables the construction of orthotropic hyperelastic energies for truly orthotropic fiber orientation states. We establish a reference scenario for fitting the Tucker average of a transversely isotropic hyperelastic energy, corresponding to a uni-directional fiber orientation, to microstructural simulations, obtained by FFT-based computational homogenization of neo-Hookean constituents. We carefully discuss ideas for accelerating the identification process, leading to a tremendous speed-up compared to a naive approach. The resulting hyperelastic material map turns out to be surprisingly accurate, simple to integrate in commercial finite element codes and fast in its execution. We demonstrate the capabilities of the extracted model by a finite element analysis of a fiber reinforced chain link.

  8. Thermodynamic Integration Methods, Infinite Swapping and the Calculation of Generalized Averages

    OpenAIRE

    Doll, J. D.; Dupuis, P.; Nyquist, P.

    2016-01-01

    In the present paper we examine the risk-sensitive and sampling issues associated with the problem of calculating generalized averages. By combining thermodynamic integration and Stationary Phase Monte Carlo techniques, we develop an approach for such problems and explore its utility for a prototypical class of applications.

  9. Numerical Simulation of the Moving Induction Heating Process with Magnetic Flux Concentrator

    Directory of Open Access Journals (Sweden)

    Feng Li

    2013-01-01

    Full Text Available Induction heating with a ferromagnetic metal powder bonded magnetic flux concentrator (MPB-MFC) demonstrates clear advantages in surface heat treatment of metals, and it is the moving variant of induction heating that is mostly applied in industrial production. Therefore, an analytical understanding of the mechanism, efficiency and controllability of the moving induction heating process is necessary for process design and optimization. This paper studies the mechanism of moving induction heating with a magnetic flux concentrator. MPB-MFC assisted moving induction heating of Inconel 718 alloy is studied by establishing a finite element simulation model. The temperature field distribution is analyzed, and the factors influencing the temperature are studied. The conclusions demonstrate that the velocity of the workpiece should be controlled properly and that the heat transfer coefficient (HTC) has little impact on the temperature development compared with the other input parameters. In addition, the validity of the static numerical model is verified by comparing the finite element simulation with experimental results on AISI 1045 steel. The numerical model established in this work can provide comprehensive understanding for process control in production.

  10. Integrated stationary Ornstein-Uhlenbeck process, and double integral processes

    Science.gov (United States)

    Abundo, Mario; Pirozzi, Enrica

    2018-03-01

    We find a representation of the integral of the stationary Ornstein-Uhlenbeck (ISOU) process in terms of Brownian motion Bt; moreover, we show that, under certain conditions on the functions f and g , the double integral process (DIP) D(t) = ∫βt g(s) (∫αs f(u) dBu) ds can be thought as the integral of a suitable Gauss-Markov process. Some theoretical and application details are given, among them we provide a simulation formula based on that representation by which sample paths, probability densities and first passage times of the ISOU process are obtained; the first-passage times of the DIP are also studied.
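A crude Euler-Maruyama scheme illustrates the pair formed by an Ornstein-Uhlenbeck process and its running integral. This numerical sketch is ours, not the Brownian-motion representation or the simulation formula derived in the paper; setting sigma = 0 recovers the deterministic exponential decay, which gives a convenient sanity check:

```python
import random

def simulate_isou(theta, mu, sigma, x0, dt, n, seed=0):
    """Euler-Maruyama scheme for dX = theta*(mu - X) dt + sigma dB,
    together with the running integral Y(t) = int_0^t X(s) ds."""
    rng = random.Random(seed)
    x, y = x0, 0.0
    xs, ys = [x], [y]
    for _ in range(n):
        y += x * dt  # left-endpoint quadrature for the integral
        x += theta * (mu - x) * dt + sigma * rng.gauss(0.0, dt ** 0.5)
        xs.append(x)
        ys.append(y)
    return xs, ys

# Noise-free check: X(t) = exp(-t) decays, and Y(10) approaches
# int_0^10 exp(-s) ds = 1 - exp(-10), i.e. nearly 1.
xs, ys = simulate_isou(theta=1.0, mu=0.0, sigma=0.0, x0=1.0, dt=0.01, n=1000)
```

With sigma > 0 the same loop yields sample paths of the ISOU process from which first-passage times can be estimated empirically, in the spirit of the simulation study described above.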

  11. Using autoregressive integrated moving average (ARIMA) models to predict and monitor the number of beds occupied during a SARS outbreak in a tertiary hospital in Singapore

    Directory of Open Access Journals (Sweden)

    Earnest Arul

    2005-05-01

    Full Text Available Abstract Background The main objective of this study is to apply autoregressive integrated moving average (ARIMA) models to make real-time predictions on the number of beds occupied in Tan Tock Seng Hospital, during the recent SARS outbreak. Methods This is a retrospective study design. Hospital admission and occupancy data for isolation beds was collected from Tan Tock Seng hospital for the period 14th March 2003 to 31st May 2003. The main outcome measure was daily number of isolation beds occupied by SARS patients. Among the covariates considered were daily number of people screened, daily number of people admitted (including observation, suspect and probable cases) and days from the most recent significant event discovery. We utilized the following strategy for the analysis. Firstly, we split the outbreak data into two. Data from 14th March to 21st April 2003 was used for model development. We used structural ARIMA models in an attempt to model the number of beds occupied. Estimation is via the maximum likelihood method using the Kalman filter. For the ARIMA model parameters, we considered the simplest parsimonious lowest order model. Results We found that the ARIMA (1,0,3) model was able to describe and predict the number of beds occupied during the SARS outbreak well. The mean absolute percentage error (MAPE) for the training set and validation set were 5.7% and 8.6% respectively, which we found was reasonable for use in the hospital setting. Furthermore, the model also provided three-day forecasts of the number of beds required. Total number of admissions and probable cases admitted on the previous day were also found to be independent prognostic factors of bed occupancy. Conclusion ARIMA models provide useful tools for administrators and clinicians in planning for real-time bed capacity during an outbreak of an infectious disease such as SARS. The model could well be used in planning for bed-capacity during outbreaks of other infectious diseases as well.

  12. Using autoregressive integrated moving average (ARIMA) models to predict and monitor the number of beds occupied during a SARS outbreak in a tertiary hospital in Singapore.

    Science.gov (United States)

    Earnest, Arul; Chen, Mark I; Ng, Donald; Sin, Leo Yee

    2005-05-11

    The main objective of this study is to apply autoregressive integrated moving average (ARIMA) models to make real-time predictions on the number of beds occupied in Tan Tock Seng Hospital, during the recent SARS outbreak. This is a retrospective study design. Hospital admission and occupancy data for isolation beds was collected from Tan Tock Seng hospital for the period 14th March 2003 to 31st May 2003. The main outcome measure was daily number of isolation beds occupied by SARS patients. Among the covariates considered were daily number of people screened, daily number of people admitted (including observation, suspect and probable cases) and days from the most recent significant event discovery. We utilized the following strategy for the analysis. Firstly, we split the outbreak data into two. Data from 14th March to 21st April 2003 was used for model development. We used structural ARIMA models in an attempt to model the number of beds occupied. Estimation is via the maximum likelihood method using the Kalman filter. For the ARIMA model parameters, we considered the simplest parsimonious lowest order model. We found that the ARIMA (1,0,3) model was able to describe and predict the number of beds occupied during the SARS outbreak well. The mean absolute percentage error (MAPE) for the training set and validation set were 5.7% and 8.6% respectively, which we found was reasonable for use in the hospital setting. Furthermore, the model also provided three-day forecasts of the number of beds required. Total number of admissions and probable cases admitted on the previous day were also found to be independent prognostic factors of bed occupancy. ARIMA models provide useful tools for administrators and clinicians in planning for real-time bed capacity during an outbreak of an infectious disease such as SARS. The model could well be used in planning for bed-capacity during outbreaks of other infectious diseases as well.
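The MAPE figures quoted above (5.7% on the training set, 8.6% on validation) are computed as the mean absolute percentage error between observed and forecast bed counts. A minimal sketch with made-up numbers (the ARIMA fit itself would typically be done with a time-series library such as statsmodels, which is not shown here):

```python
def mape(actual, predicted):
    """Mean absolute percentage error, in percent; the accuracy
    measure reported for the bed-occupancy forecasts."""
    return 100.0 * sum(abs((a - p) / a)
                       for a, p in zip(actual, predicted)) / len(actual)

# Invented daily bed counts and three-day-style forecasts:
beds_actual = [40, 45, 50, 48]
beds_forecast = [42, 44, 47, 50]
err = mape(beds_actual, beds_forecast)  # roughly 4.3%
```

Because MAPE is scale-free, the same threshold of acceptability can be applied whether the ward holds tens or hundreds of beds, which is presumably why the authors judged errors under ~10% reasonable for hospital planning.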

  13. SAR Ground Moving Target Indication Based on Relative Residue of DPCA Processing

    Directory of Open Access Journals (Sweden)

    Jia Xu

    2016-10-01

    Full Text Available For modern synthetic aperture radar (SAR), there are ever more urgent demands on ground moving target indication (GMTI), which covers not only point moving targets like cars, trucks or tanks but also distributed moving targets like river or ocean surfaces. Among the existing GMTI methods, displaced phase center antenna (DPCA) processing can effectively cancel the strong ground clutter and has been widely used. However, its detection performance is closely related to the target's signal-to-clutter ratio (SCR) as well as its radial velocity, and it cannot effectively detect weak, large-sized river surfaces in strong ground clutter because of their low SCR caused by specular scattering. This paper proposes a novel method called relative residue of DPCA (RR-DPCA), which jointly utilizes the DPCA cancellation outputs and the multi-look images to improve the detection performance on weak river surfaces. Furthermore, based on a statistical analysis of the RR-DPCA outputs on a homogeneous background, the cell averaging (CA) method can be well applied for subsequent constant false alarm rate (CFAR) detection. The proposed RR-DPCA method can detect point moving targets and distributed moving targets simultaneously. Finally, results on both simulated and real data are provided to demonstrate the effectiveness of the proposed SAR/GMTI method.

  14. An Integrated Approach for Visual Analysis of a Multi-Source Moving Objects Knowledge Base

    NARCIS (Netherlands)

    Willems, C.M.E.; van Hage, W.R.; de Vries, G.K.D.; Janssens, J.; Malaisé, V.

    2010-01-01

    We present an integrated and multidisciplinary approach for analyzing the behavior of moving objects. The results originate from an ongoing research of four different partners from the Dutch Poseidon project (Embedded Systems Institute (2007)), which aims to develop new methods for Maritime Safety

  15. An integrated approach for visual analysis of a multi-source moving objects knowledge base

    NARCIS (Netherlands)

    Willems, N.; Hage, van W.R.; Vries, de G.; Janssens, J.H.M.; Malaisé, V.

    2010-01-01

    We present an integrated and multidisciplinary approach for analyzing the behavior of moving objects. The results originate from an ongoing research of four different partners from the Dutch Poseidon project (Embedded Systems Institute (2007)), which aims to develop new methods for Maritime Safety

  16. An empirical investigation on the forecasting ability of mallows model averaging in a macro economic environment

    Science.gov (United States)

    Yin, Yip Chee; Hock-Eam, Lim

    2012-09-01

    This paper investigates the forecasting ability of Mallows Model Averaging (MMA) through an empirical analysis of the GDP growth rates of five Asian countries: Malaysia, Thailand, the Philippines, Indonesia and China. Results reveal that MMA shows no noticeable difference in predictive ability compared to the general autoregressive fractionally integrated moving average (ARFIMA) model, and that its predictive ability is sensitive to the effect of financial crises. MMA could be an alternative forecasting method for samples without recent outliers such as financial crises.

  17. Performance of the Analog Moving Window Detector

    DEFF Research Database (Denmark)

    Hansen, V. Gregers

    1970-01-01

    A type of analog integrating moving window detector for use with a scanning pulse radar is examined. A performance analysis is carried out, which takes into account both the radiation pattern of the antenna and the dynamic character of the detection process due to the angular scanning...

  18. Comparing a recursive digital filter with the moving-average and sequential probability-ratio detection methods for SNM portal monitors

    International Nuclear Information System (INIS)

    Fehlau, P.E.

    1993-01-01

    The author compared a recursive digital filter proposed as a detection method for French special nuclear material monitors with the author's detection methods, which employ a moving-average scaler or a sequential probability-ratio test. Nine test subjects each repeatedly carried a test source through a walk-through portal monitor that had the same nuisance-alarm rate with each method. He found that the average detection probability for the test source was also the same for each method. However, the recursive digital filter may have one drawback: its exponentially decreasing response to past radiation intensity prolongs the impact of any interference from radiation sources or radiation-producing machinery. He also examined the influence of each test subject on the monitor's operation by measuring individual attenuation factors for background and source radiation, then ranked the subjects' attenuation factors against their individual probabilities for detecting the test source. The one inconsistent ranking was probably caused by that subject's unusually long stride when passing through the portal.
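Wald's sequential probability-ratio test, one of the detection methods compared here, accumulates a log-likelihood ratio over successive count intervals and decides as soon as it crosses an upper (alarm) or lower (clear) boundary. A hypothetical sketch for Poisson-distributed counts; the mean rates, error rates, and count sequences are invented for illustration and are not the monitor's actual parameters:

```python
import math

def sprt_poisson(counts, bkg_mean, src_mean, alpha=0.01, beta=0.01):
    """Wald sequential probability-ratio test on Poisson counts:
    decide 'alarm' (source present, mean src_mean) versus
    'clear' (background only, mean bkg_mean)."""
    upper = math.log((1 - beta) / alpha)   # alarm boundary
    lower = math.log(beta / (1 - alpha))   # clear boundary
    llr = 0.0
    for k in counts:
        # Poisson log-likelihood ratio for one counting interval.
        llr += k * math.log(src_mean / bkg_mean) - (src_mean - bkg_mean)
        if llr >= upper:
            return "alarm"
        if llr <= lower:
            return "clear"
    return "undecided"

# Counts well above a background mean of 10 trigger an alarm quickly.
decision = sprt_poisson([18, 20, 19, 21], bkg_mean=10.0, src_mean=18.0)
```

Unlike a fixed-window moving-average scaler, the SPRT reaches a decision in a variable number of intervals, typically fewer when the evidence is strong, which is its main attraction for portal monitoring.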

  19. On critical cases in limit theory for stationary increments Lévy driven moving averages

    DEFF Research Database (Denmark)

    Basse-O'Connor, Andreas; Podolskij, Mark

    averages. The limit theory heavily depends on the interplay between the given order of the increments, the considered power, the Blumenthal-Getoor index of the driving pure jump Lévy process L and the behavior of the kernel function g at 0. In this work we will study the critical cases, which were...

  20. Power Based Phase-Locked Loop Under Adverse Conditions with Moving Average Filter for Single-Phase System

    Directory of Open Access Journals (Sweden)

    Menxi Xie

    2017-06-01

    Full Text Available A high-performance synchronization method is critical for grid-connected power converters. For single-phase systems, the power-based phase-locked loop (pPLL) uses a multiplier as the phase detector (PD). When the single-phase grid voltage is distorted, the phase-error information contains ac disturbances oscillating at integer multiples of the fundamental frequency, which lead to detection errors. This paper presents a new scheme based on a moving average filter (MAF) applied in-loop of the pPLL. The signal characteristics of the phase error are discussed in detail. A predictive rule is adopted to compensate for the delay induced by the MAF, thus achieving fast dynamic response. When the frequency deviates from its nominal value, the estimated frequency is fed back to adjust the filter window length of the MAF and the buffer size of the predictive rule. Simulation and experimental results show that the proposed PLL achieves good performance under adverse grid conditions.
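A moving average filter whose window spans exactly one period of the disturbance nulls ripple at integer multiples of that frequency, which is why the estimated frequency must be fed back to adjust the window length. A minimal sketch; the sample counts, dc level, and ripple amplitude are invented for illustration:

```python
import math

def moving_average_filter(signal, window):
    """Causal moving average: mean of the last `window` samples
    (shorter partial windows during start-up)."""
    out = []
    for i in range(len(signal)):
        lo = max(0, i - window + 1)
        out.append(sum(signal[lo:i + 1]) / (i + 1 - lo))
    return out

# Phase-detector output: a dc phase error (0.3) plus a 2nd-harmonic
# ripple, as produced by a multiplier PD on a distorted grid.
n_per_period = 100                 # samples per fundamental period
ripple_period = n_per_period // 2  # 2nd harmonic -> half the period
sig = [0.3 + 0.5 * math.sin(2 * math.pi * 2 * i / n_per_period)
       for i in range(400)]
filtered = moving_average_filter(sig, ripple_period)
# After the start-up transient, only the dc phase error (0.3) remains.
```

The filter's group delay of half a window is the latency that the paper's predictive rule compensates for.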

  1. Move me, astonish me… delight my eyes and brain: The Vienna Integrated Model of top-down and bottom-up processes in Art Perception (VIMAP) and corresponding affective, evaluative, and neurophysiological correlates.

    Science.gov (United States)

    Pelowski, Matthew; Markey, Patrick S; Forster, Michael; Gerger, Gernot; Leder, Helmut

    2017-07-01

    This paper has a rather audacious purpose: to present a comprehensive theory explaining, and further providing hypotheses for the empirical study of, the multiple ways by which people respond to art. Despite common agreement that interaction with art can be based on a compelling, and occasionally profound, psychological experience, the nature of these interactions is still under debate. We propose a model, The Vienna Integrated Model of Art Perception (VIMAP), with the goal of resolving the multifarious processes that can occur when we perceive and interact with visual art. Specifically, we focus on the need to integrate bottom-up, artwork-derived processes, which have formed the bulk of previous theoretical and empirical assessments, with top-down mechanisms which can describe how individuals adapt or change within their processing experience, and thus how individuals may come to particularly moving, disturbing, transformative, as well as mundane, results. This is achieved by combining several recent lines of theoretical research into a new integrated approach built around three processing checks, which we argue can be used to systematically delineate the possible outcomes in art experience. We also connect our model's processing stages to specific hypotheses for emotional, evaluative, and physiological factors, and address main topics in psychological aesthetics, including provocative reactions (chills, awe, thrills, the sublime) and the difference between "aesthetic" and "everyday" emotional responses. Finally, we take the needed step of connecting stages to functional regions in the brain, as well as broader core networks that may coincide with the proposed cognitive checks, and which taken together can serve as a basis for future empirical and theoretical art research. Copyright © 2017 Elsevier B.V. All rights reserved.

  2. Move me, astonish me… delight my eyes and brain: The Vienna Integrated Model of top-down and bottom-up processes in Art Perception (VIMAP) and corresponding affective, evaluative, and neurophysiological correlates

    Science.gov (United States)

    Pelowski, Matthew; Markey, Patrick S.; Forster, Michael; Gerger, Gernot; Leder, Helmut

    2017-07-01

    This paper has a rather audacious purpose: to present a comprehensive theory explaining, and further providing hypotheses for the empirical study of, the multiple ways by which people respond to art. Despite common agreement that interaction with art can be based on a compelling, and occasionally profound, psychological experience, the nature of these interactions is still under debate. We propose a model, The Vienna Integrated Model of Art Perception (VIMAP), with the goal of resolving the multifarious processes that can occur when we perceive and interact with visual art. Specifically, we focus on the need to integrate bottom-up, artwork-derived processes, which have formed the bulk of previous theoretical and empirical assessments, with top-down mechanisms which can describe how individuals adapt or change within their processing experience, and thus how individuals may come to particularly moving, disturbing, transformative, as well as mundane, results. This is achieved by combining several recent lines of theoretical research into a new integrated approach built around three processing checks, which we argue can be used to systematically delineate the possible outcomes in art experience. We also connect our model's processing stages to specific hypotheses for emotional, evaluative, and physiological factors, and address main topics in psychological aesthetics, including provocative reactions (chills, awe, thrills, the sublime) and the difference between "aesthetic" and "everyday" emotional responses. Finally, we take the needed step of connecting stages to functional regions in the brain, as well as broader core networks that may coincide with the proposed cognitive checks, and which taken together can serve as a basis for future empirical and theoretical art research.

  3. Nonlinear Autoregressive Network with the Use of a Moving Average Method for Forecasting Typhoon Tracks

    OpenAIRE

    Tienfuan Kerh; Shin-Hung Wu

    2017-01-01

    Forecasting of a typhoon's moving path may help to evaluate the potential negative impacts in the neighbourhood areas along the moving path. This study proposed using both static and dynamic neural network models to link a time series of typhoon track parameters including longitude and latitude of the typhoon central location, cyclonic radius, central wind speed, and typhoon moving speed. Based on the historical records of 100 typhoons, the performances of neural network models are ev...

  4. Experimental validation of heterogeneity-corrected dose-volume prescription on respiratory-averaged CT images in stereotactic body radiotherapy for moving tumors

    International Nuclear Information System (INIS)

    Nakamura, Mitsuhiro; Miyabe, Yuki; Matsuo, Yukinori; Kamomae, Takeshi; Nakata, Manabu; Yano, Shinsuke; Sawada, Akira; Mizowaki, Takashi; Hiraoka, Masahiro

    2012-01-01

    The purpose of this study was to experimentally assess the validity of heterogeneity-corrected dose-volume prescription on respiratory-averaged computed tomography (RACT) images in stereotactic body radiotherapy (SBRT) for moving tumors. Four-dimensional computed tomography (CT) data were acquired while a dynamic anthropomorphic thorax phantom with a solitary target moved. Motion pattern was based on cos(t) with a constant respiration period of 4.0 sec along the longitudinal axis of the CT couch. The extent of motion (A1) was set in the range of 0.0–12.0 mm at 3.0-mm intervals. Treatment planning with the heterogeneity-corrected dose-volume prescription was designed on RACT images. A new commercially available Monte Carlo algorithm of well-commissioned 6-MV photon beam was used for dose calculation. Dosimetric effects of intrafractional tumor motion were then investigated experimentally under the same conditions as 4D CT simulation using the dynamic anthropomorphic thorax phantom, films, and an ionization chamber. The passing rate of the γ index was 98.18%, with the criteria of 3 mm/3%. The dose error between the planned and the measured isocenter dose in moving condition was within ±0.7%. From the dose-area histograms on the film, the mean ± standard deviation of the dose covering 100% of the cross section of the target was 102.32 ± 1.20% (range, 100.59–103.49%). By contrast, the irradiated areas receiving more than 95% dose for A1 = 12 mm were 1.46 and 1.33 times larger than those for A1 = 0 mm in the coronal and sagittal planes, respectively. This phantom study demonstrated that the cross section of the target received 100% dose under moving conditions in both the coronal and sagittal planes, suggesting that the heterogeneity-corrected dose-volume prescription on RACT images is acceptable in SBRT for moving tumors.

  5. A novel integrated thermally coupled moving bed reactors for naphtha reforming process with hydrodealkylation of toluene

    International Nuclear Information System (INIS)

    Iranshahi, Davood; Saeedi, Reza; Azizi, Kolsoom; Nategh, Mahshid

    2017-01-01

    Highlights: • A novel thermally coupled reactor for the CCR naphtha reforming process is modeled. • The required heat of the naphtha process is obtained from toluene hydrodealkylation. • A new kinetic model involving 32 pseudo-components and 84 reactions is proposed. • Aromatics and hydrogen production increase by 19% and 23%, respectively. - Abstract: Due to the importance of the catalytic naphtha reforming process in refineries, developing this process to attain the highest yield of desired products is crucial. In this study, the continuous catalyst regeneration naphtha reforming process with radial flow is coupled with hydrodealkylation of toluene to prevent energy loss while enhancing aromatics and hydrogen yields. In this coupled process, heat is transferred between hot and cold sections (from hydrodealkylation of toluene to the catalytic naphtha reforming process) using the process integration method. A steady-state two-dimensional model, which considers coke formation on the catalyst pellets, is developed, and 32 pseudo-components with 84 reactions are investigated. The kinetic model used for the HDA process is homogeneous and non-catalytic. The modeling results reveal approximate increases of 19% and 23% in aromatics and hydrogen molar flow rates, respectively, in comparison with the conventional naphtha reforming process. The improvement in aromatics production clearly indicates that HDA is a suitable process to couple with naphtha reforming.

  6. A landslide-quake detection algorithm with STA/LTA and diagnostic functions of moving average and scintillation index: A preliminary case study of the 2009 Typhoon Morakot in Taiwan

    Science.gov (United States)

    Wu, Yu-Jie; Lin, Guan-Wei

    2017-04-01

    Since 1999, Taiwan has experienced a rapid rise in the number of landslides, which reached a peak after the 2009 Typhoon Morakot. Although it has been shown that ground-motion signals induced by slope processes can be recorded by seismographs, such signals are difficult to distinguish in continuous seismic records due to the lack of distinct P and S waves. In this study, we combine three common seismic detectors: the short-term average/long-term average (STA/LTA) approach and the two diagnostic functions of moving average and scintillation index. Based on these detectors, we establish an auto-detection algorithm for landslide-quakes, with detection thresholds defined to distinguish landslide-quakes from earthquakes and background noise. To further evaluate the proposed detection algorithm, we apply it to seismic archives recorded by the Broadband Array in Taiwan for Seismology (BATS) during the 2009 Typhoon Morakot and locate the discrete landslide-quakes detected by the automatic algorithm. The detection results are consistent with those of visual inspection, and the algorithm can hence be used to automatically monitor landslide-quakes.
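The STA/LTA detector flags samples where a short-term amplitude average jumps relative to a long-term one. A minimal trailing-window sketch; the window lengths, threshold, and synthetic trace are illustrative, not the paper's tuned values:

```python
def sta_lta(signal, nsta, nlta):
    """Ratio of short-term to long-term average of |signal|,
    computed at each sample from trailing windows; zero until
    the long-term window is fully populated."""
    ratios = []
    for i in range(len(signal)):
        if i + 1 < nlta:
            ratios.append(0.0)
            continue
        sta = sum(abs(v) for v in signal[i - nsta + 1:i + 1]) / nsta
        lta = sum(abs(v) for v in signal[i - nlta + 1:i + 1]) / nlta
        ratios.append(sta / lta)
    return ratios

# Low-amplitude noise followed by a sudden emergent event: the
# ratio stays near 1 on noise and spikes when the event arrives.
trace = [0.1] * 200 + [1.0] * 20
r = sta_lta(trace, nsta=5, nlta=100)
triggered = max(r) > 3.0
```

Emergent landslide-quake signals lacking sharp P and S onsets raise the STA/LTA ratio more slowly than earthquakes, which is why the paper pairs it with the moving-average and scintillation-index diagnostics before declaring a detection.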

  7. RAINFALL AND DISCHARGE ANALYSIS OF THE SWAT MODEL USING THE MOVING AVERAGE METHOD IN THE CILIWUNG HULU WATERSHED

    Directory of Open Access Journals (Sweden)

    Defri Satiya Zuma

    2017-09-01

    Full Text Available A watershed can be regarded as a hydrological system that transforms rainfall input into outputs such as flow and sediment. The transformation of inputs into outputs has specific forms and properties and involves many processes, including those occurring on the land surface, in river channels, and in the soil and aquifers. This study aimed to apply the SWAT model to the Ciliwung Hulu Watershed and to assess the effect of 3-day, 5-day, 7-day and 10-day average rainfall on the hydrological characteristics of the watershed. The correlation coefficient (r) between rainfall and discharge was positive, indicating a direct relationship between rainfall and discharge in the upstream, midstream and downstream parts of the watershed. The upper-limit discharge ratio showed a downward trend from upstream to downstream, while the lower-limit discharge ratio showed an upward trend, indicating that peak discharge in the Ciliwung Hulu Watershed decreases from upstream to downstream while baseflow increases. Since the upstream part of the watershed has the highest ratio of peak discharge to baseflow, it needs soil and water conservation as well as civil engineering measures. It is concluded that the SWAT model can be applied well in the Ciliwung Hulu Watershed and that the 10-day average rainfall affects the hydrological characteristics most; at this averaging window, all components contribute maximally to river discharge.
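The 3-, 5-, 7- and 10-day average rainfall series used as model inputs are plain rolling means of the daily totals. A minimal sketch with invented daily rainfall values (in mm):

```python
def rolling_mean(series, k):
    """k-day moving average; defined from day k-1 onward, so the
    output is len(series) - k + 1 values long."""
    return [sum(series[i - k + 1:i + 1]) / k
            for i in range(k - 1, len(series))]

# Ten invented daily rainfall totals (mm):
rain = [0, 5, 12, 3, 0, 0, 8, 20, 6, 1]
avg3 = rolling_mean(rain, 3)   # 8 values; e.g. avg3[2] = (12+3+0)/3
avg10 = rolling_mean(rain, 10) # a single value: the 10-day mean
```

Longer windows smooth out individual storms, which is consistent with the finding above that the 10-day average correlates best with discharge: it approximates the antecedent wetness driving baseflow rather than any single rainfall event.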

  8. Integrated Renovation Process

    DEFF Research Database (Denmark)

    Galiotto, Nicolas; Heiselberg, Per; Knudstrup, Mary-Ann

    2016-01-01

    renovation to be overcome. The homeowners were better integrated and their preferences and immaterial values were better taken into account. To keep the decision-making process economically viable and timely, the process as known today still needs to be improved, and new tools need to be developed....... This paper presents a new scheme: the integrated renovation process. One successful case study is introduced, and recommendations for future developments needed in the field are provided....

  9. Simulation of Pedestrian Behavior in the Collision-Avoidance Process considering Their Moving Preferences

    Directory of Open Access Journals (Sweden)

    Zhilu Yuan

    2017-01-01

    Full Text Available Walking habits can affect the self-organizing movement in pedestrian flow. In China, pedestrians prefer to walk along the right-hand side in the collision-avoidance process, and the same is true of the left-hand preference followed in several other countries. Through experiments with pedestrian flow, we find that the relative position between pedestrians can affect their moving preferences. We propose a collision-avoidance force based on the social force model, which considers predictions of potential conflict and the relative position between pedestrians. In the simulation, we use the improved model to explore the effect of moving preference on the collision-avoidance process and self-organizing pedestrian movement. We conclude that the improved model brings the simulation closer to reality and that moving preference is conducive to the self-adjustment of counterflow.
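    As a rough illustration of the idea (not the paper's formulation), a Helbing-style repulsive force with a small rightward bias might look like the following; the constants and the rotation trick are hypothetical:

    ```python
    import math

    def avoidance_force(pos, other, strength=2.0, scale=0.5, bias=0.3):
        """Repulsive force of the social-force type, rotated slightly
        clockwise so agents tend to pass on the right-hand side.
        `strength`, `scale` and `bias` are illustrative constants."""
        dx, dy = pos[0] - other[0], pos[1] - other[1]
        dist = math.hypot(dx, dy)
        if dist == 0:
            return (0.0, 0.0)
        # Magnitude decays exponentially with distance.
        mag = strength * math.exp(-dist / scale)
        nx, ny = dx / dist, dy / dist
        # Rotate the push clockwise by `bias` radians: right-hand preference.
        fx = mag * (nx * math.cos(-bias) - ny * math.sin(-bias))
        fy = mag * (nx * math.sin(-bias) + ny * math.cos(-bias))
        return (fx, fy)

    # An agent at the origin facing another agent directly ahead on the x-axis
    # is pushed back (negative x) and sideways (positive y).
    f = avoidance_force((0.0, 0.0), (1.0, 0.0))
    ```

    Flipping the sign of `bias` would model the left-hand preference followed in other countries.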

  10. DO DYNAMIC NEURAL NETWORKS STAND A BETTER CHANCE IN FRACTIONALLY INTEGRATED PROCESS FORECASTING?

    Directory of Open Access Journals (Sweden)

    Majid Delavari

    2013-04-01

    Full Text Available The main purpose of the present study was to investigate the capabilities of two generations of models in forecasting daily data on the return index of the Tehran Stock Exchange (TSE): models based on dynamic neural networks (e.g., the Nonlinear Neural network Auto Regressive, or NNAR, model) and the Auto Regressive Fractionally Integrated Moving Average (ARFIMA) model, which is based on a fractional integration approach. In order to compare these models under similar conditions, Mean Square Error (MSE) and Root Mean Square Error (RMSE) were selected as criteria for the models’ simulated out-of-sample forecasting performance. In addition, the fractal markets hypothesis was examined, and according to the findings a fractal structure was confirmed to exist in the time series under investigation. Another finding of the study was that, based on the criteria introduced for calculating forecasting error, the dynamic artificial neural network model outperformed the ARFIMA model in out-of-sample forecasting.
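    The MSE and RMSE criteria used to compare the models are straightforward to compute; the forecast values below are invented:

    ```python
    def mse(actual, pred):
        """Mean square error between actual values and forecasts."""
        return sum((a - p) ** 2 for a, p in zip(actual, pred)) / len(actual)

    def rmse(actual, pred):
        """Root mean square error."""
        return mse(actual, pred) ** 0.5

    actual  = [1.0, 2.0, 3.0, 4.0]
    model_a = [1.1, 1.9, 3.2, 3.8]   # hypothetical forecasts from model A
    model_b = [1.5, 2.5, 2.5, 4.5]   # hypothetical forecasts from model B
    better = "A" if rmse(actual, model_a) < rmse(actual, model_b) else "B"
    ```

    Whichever model yields the lower out-of-sample RMSE is preferred, which is exactly the comparison the study runs between NNAR and ARFIMA.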

  11. Integration process and logistics results; Proceso de integracion y resultados de logistica y aprovisionamiento

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2004-07-01

    The Procurement and Logistics functions have gone through a process of integration from the beginning of the integrated management of Asco and Vandellos II up to the present. These are functions that lend themselves to being designed to deliver a single product to the rest of the organization, defined from a high level of expectations, and that admit simplifications and the realization of synergies when approached from an integrated perspective. The functions analyzed are: Service and Material Purchasing, Warehouse and Material Management, and Documentation and General Services Management. In all cases, objectives, procedures and information systems were unified to accomplish the integration. As for the organization, a decision was made in each case on whether or not to outsource. The decisive corporate strategy to integrate, resulting in actions such as moving corporate headquarters to Vandellos II, corporate consolidation, regulation of employment and implementation of the ENDESA Group Economic Information System (SIE), has shaped this process, which at present can be considered practically complete. (Author)

  12. Conceptual Frameworks for the Workplace Change Adoption Process: Elements Integration from Decision Making and Learning Cycle Process.

    Science.gov (United States)

    Radin Umar, Radin Zaid; Sommerich, Carolyn M; Lavender, Steve A; Sanders, Elizabeth; Evans, Kevin D

    2018-05-14

    Sound workplace ergonomics and safety-related interventions may be resisted by employees, and this may be detrimental to multiple stakeholders. Understanding fundamental aspects of decision making, behavioral change, and learning cycles may provide insights into pathways influencing employees' acceptance of interventions. This manuscript reviews published literature on thinking processes and other topics relevant to decision making and incorporates the findings into two new conceptual frameworks of the workplace change adoption process. Such frameworks are useful for thinking about adoption in different ways and testing changes to traditional intervention implementation processes. Moving forward, it is recommended that future research focus on systematic exploration of implementation process activities that integrate principles from the research literature on sensemaking, decision making, and learning processes. Such exploration may provide the groundwork for development of specific implementation strategies that are theoretically grounded and provide a revised understanding of how successful intervention adoption processes work.

  13. The moving-window Bayesian maximum entropy framework: estimation of PM2.5 yearly average concentration across the contiguous United States.

    Science.gov (United States)

    Akita, Yasuyuki; Chen, Jiu-Chiuan; Serre, Marc L

    2012-09-01

    Geostatistical methods are widely used in estimating long-term exposures for epidemiological studies on air pollution, despite their limited capabilities to handle spatial non-stationarity over large geographic domains and the uncertainty associated with missing monitoring data. We developed a moving-window (MW) Bayesian maximum entropy (BME) method and applied this framework to estimate fine particulate matter (PM2.5) yearly average concentrations over the contiguous US. The MW approach accounts for the spatial non-stationarity, while the BME method rigorously processes the uncertainty associated with data missingness in the air-monitoring system. In the cross-validation analyses conducted on a set of randomly selected complete PM2.5 data in 2003 and on simulated data with different degrees of missing data, we demonstrate that the MW approach alone leads to at least 17.8% reduction in mean square error (MSE) in estimating the yearly PM2.5. Moreover, the MWBME method further reduces the MSE by 8.4-43.7%, with the proportion of incomplete data increased from 18.3% to 82.0%. The MWBME approach leads to significant reductions in estimation error and thus is recommended for epidemiological studies investigating the effect of long-term exposure to PM2.5 across large geographical domains with expected spatial non-stationarity.

  14. The moving-window Bayesian Maximum Entropy framework: Estimation of PM2.5 yearly average concentration across the contiguous United States

    Science.gov (United States)

    Akita, Yasuyuki; Chen, Jiu-Chiuan; Serre, Marc L.

    2013-01-01

    Geostatistical methods are widely used in estimating long-term exposures for air pollution epidemiological studies, despite their limited capabilities to handle spatial non-stationarity over large geographic domains and uncertainty associated with missing monitoring data. We developed a moving-window (MW) Bayesian Maximum Entropy (BME) method and applied this framework to estimate fine particulate matter (PM2.5) yearly average concentrations over the contiguous U.S. The MW approach accounts for the spatial non-stationarity, while the BME method rigorously processes the uncertainty associated with data missingness in the air monitoring system. In the cross-validation analyses conducted on a set of randomly selected complete PM2.5 data in 2003 and on simulated data with different degrees of missing data, we demonstrate that the MW approach alone leads to at least 17.8% reduction in mean square error (MSE) in estimating the yearly PM2.5. Moreover, the MWBME method further reduces the MSE by 8.4% to 43.7% with the proportion of incomplete data increased from 18.3% to 82.0%. The MWBME approach leads to significant reductions in estimation error and thus is recommended for epidemiological studies investigating the effect of long-term exposure to PM2.5 across large geographical domains with expected spatial non-stationarity. PMID:22739679

  15. Multivariate Pareto Minification Processes | Umar | Journal of the ...

    African Journals Online (AJOL)

    Autoregressive (AR) and autoregressive moving average (ARMA) processes with multivariate exponential (ME) distribution are presented and discussed. The theory of positive dependence is used to show that in many cases, multivariate exponential autoregressive (MEAR) and multivariate autoregressive moving average ...

  16. Job Surfing: Move On to Move Up.

    Science.gov (United States)

    Martin, Justin

    1997-01-01

    Looks at the process of switching jobs and changing careers. Discusses when to consider options and make the move as well as the need to be flexible and open minded. Provides a test for determining the chances of promotion and when to move on. (JOW)

  17. Silicon integrated circuit process

    International Nuclear Information System (INIS)

    Lee, Jong Duck

    1985-12-01

    This book introduces the silicon integrated circuit process. It is composed of seven parts: the oxidation process; the diffusion process; the ion implantation process, covering ion implantation equipment, damage, annealing, and the influence on the manufacture of integrated circuits and devices; the chemical vapor deposition process, such as silicon epitaxy, LPCVD and PECVD; the photolithography process, including sensitizers, spin coating, hard bake, reflection of light and related problems, and infrared bake; the etching process, covering wet etch, dry etch, special etch and etching problems; and the metal process, covering metal-silicon connections, the aluminum process, the reliability of aluminum, and the test process.

  18. Silicon integrated circuit process

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jong Duck

    1985-12-15

    This book introduces the silicon integrated circuit process. It is composed of seven parts: the oxidation process; the diffusion process; the ion implantation process, covering ion implantation equipment, damage, annealing, and the influence on the manufacture of integrated circuits and devices; the chemical vapor deposition process, such as silicon epitaxy, LPCVD and PECVD; the photolithography process, including sensitizers, spin coating, hard bake, reflection of light and related problems, and infrared bake; the etching process, covering wet etch, dry etch, special etch and etching problems; and the metal process, covering metal-silicon connections, the aluminum process, the reliability of aluminum, and the test process.

  19. Noise is the new signal: Moving beyond zeroth-order geomorphology (Invited)

    Science.gov (United States)

    Jerolmack, D. J.

    2010-12-01

    The last several decades have witnessed a rapid growth in our understanding of landscape evolution, led by the development of geomorphic transport laws - time- and space-averaged equations relating mass flux to some physical process(es). In statistical mechanics this approach is called mean field theory (MFT), in which complex many-body interactions are replaced with an external field that represents the average effect of those interactions. Because MFT neglects all fluctuations around the mean, it has been described as a zeroth-order fluctuation model. The mean field approach to geomorphology has enabled the development of landscape evolution models, and led to a fundamental understanding of many landform patterns. Recent research, however, has highlighted two limitations of MFT: (1) The integral (averaging) time and space scales in geomorphic systems are sometimes poorly defined and often quite large, placing the mean field approximation on uncertain footing, and; (2) In systems exhibiting fractal behavior, an integral scale does not exist - e.g., properties like mass flux are scale-dependent. In both cases, fluctuations in sediment transport are non-negligible over the scales of interest. In this talk I will synthesize recent experimental and theoretical work that confronts these limitations. Discrete element models of fluid and grain interactions show promise for elucidating transport mechanics and pattern-forming instabilities, but require detailed knowledge of micro-scale processes and are computationally expensive. An alternative approach is to begin with a reasonable MFT, and then add higher-order terms that capture the statistical dynamics of fluctuations. In either case, moving beyond zeroth-order geomorphology requires a careful examination of the origins and structure of transport “noise”. I will attempt to show how studying the signal in noise can both reveal interesting new physics, and also help to formalize the applicability of geomorphic

  20. Canada’s 2010 Tax Competitiveness Ranking: Moving to the Average but Biased Against Services

    Directory of Open Access Journals (Sweden)

    Duanjie Chen

    2011-02-01

    Full Text Available For the first time since 1975 (the year Canada’s marginal effective tax rates were first measured), Canada has become the most tax-competitive country among G-7 states with respect to taxation of capital investment. Even more remarkably, Canada accomplished this feat within a mere six years, having previously been the least tax-competitive G-7 member. Even in comparison to strongly growing emerging economies, Canada’s 2010 marginal effective tax rate on capital is still above average. The planned reductions in federal and provincial corporate taxes by 2013 will reduce Canada’s effective tax rate on new investments to 18.4 percent, below the Organization for Economic Co-operation and Development (OECD) 2010 average and close to the average of the 50 non-OECD countries studied. This remarkable change in Canada’s tax competitiveness must be maintained in the coming years, as countries are continually reducing their business taxation despite the recent fiscal pressures arising from the 2008-9 downturn in the world economy. Many countries have forged ahead with significant reforms designed to increase tax competitiveness and improve tax neutrality, including Greece, Israel, Japan, New Zealand, Taiwan and the United Kingdom. The continuing bias in Canada’s corporate income tax structure favouring manufacturing and processing business warrants close scrutiny. Measured by the difference in the marginal effective tax rate on capital between manufacturing and the broad range of service sectors, Canada has the greatest gap in tax burdens between manufacturing and services among OECD countries. Surprisingly, preferential tax treatment (such as fast write-offs and investment tax credits) favouring only manufacturing and processing activities has become the norm in Canada, although it does not exist in most developed economies.

  1. Autoregressive-moving-average hidden Markov model for vision-based fall prediction-An application for walker robot.

    Science.gov (United States)

    Taghvaei, Sajjad; Jahanandish, Mohammad Hasan; Kosuge, Kazuhiro

    2017-01-01

    Population aging requires societies to provide the elderly with safe and dependable assistive technologies for daily life activities, and improving fall detection algorithms can play a major role in achieving this goal. This article proposes a real-time fall prediction algorithm based on visual data of a user of a walking assistive system acquired from a depth sensor. In the absence of a coupled dynamic model of the human and the assistive walker, a hybrid "system identification-machine learning" approach is used. An autoregressive-moving-average (ARMA) model is fitted on the time-series walking data to forecast the upcoming states, and a hidden Markov model (HMM) based classifier is built on top of the ARMA model to predict falling in the upcoming time frames. The performance of the algorithm is evaluated through experiments with four subjects, including an experienced physiotherapist, using a walker robot in five different falling scenarios: fall forward, fall down, fall back, fall left, and fall right. The algorithm successfully predicts falls at a rate of 84.72%.
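    As a simplified sketch of the system-identification step (the paper fits a full ARMA model and adds an HMM classifier; here only an AR(1) coefficient is estimated by least squares, and the signal values are invented):

    ```python
    def fit_ar1(series):
        """Least-squares estimate of phi in x[t] = phi * x[t-1] + noise,
        assuming a zero-mean series."""
        num = sum(series[t] * series[t - 1] for t in range(1, len(series)))
        den = sum(x * x for x in series[:-1])
        return num / den

    # Hypothetical zero-mean 'sway' signal extracted from the depth sensor.
    sway = [0.5, 0.4, 0.33, 0.26, 0.21, 0.17, 0.13]
    phi = fit_ar1(sway)
    forecast = phi * sway[-1]   # one-step-ahead state prediction
    ```

    A classifier (an HMM in the paper) would then label sequences of such forecast states as "falling" or "not falling".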

  2. Buried waste integrated demonstration technology integration process

    International Nuclear Information System (INIS)

    Ferguson, J.S.; Ferguson, J.E.

    1992-04-01

    A Technology Integration Process was developed for the Idaho National Engineering Laboratory (INEL) Buried Waste Integrated Demonstration (BWID) Program to facilitate the transfer of technology and knowledge from industry, universities, and other Federal agencies into the BWID; to successfully transfer demonstrated technology and knowledge from the BWID to industry, universities, and other Federal agencies; and to share demonstrated technologies and knowledge between Integrated Demonstrations and other Department of Energy (DOE) programs spread throughout the DOE Complex. This document also details specific methods and tools for integrating and transferring technologies into or out of the BWID program. The document provides background on the BWID program and technology development needs, demonstrates the direction of technology transfer, illustrates current processes for this transfer, and lists points of contact for prospective participants in the BWID technology transfer efforts. The Technology Integration Process was prepared to ensure compliance with the requirements of DOE's Office of Technology Development (OTD)

  3. Modeling Autoregressive Processes with Moving-Quantiles-Implied Nonlinearity

    Directory of Open Access Journals (Sweden)

    Isao Ishida

    2015-01-01

    Full Text Available We introduce and investigate some properties of a class of nonlinear time series models based on moving sample quantiles in the autoregressive data generating process. We derive a test to detect this type of nonlinearity. Using daily realized volatility data for Standard & Poor’s 500 (S&P 500) and several other indices, these models performed well in an out-of-sample forecasting exercise compared with forecasts based on the usual linear heterogeneous autoregressive and other models of realized volatility.
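    The moving sample quantile underlying such models can be computed with a sliding sorted window; this sketch uses a nearest-rank rule, which may differ from the authors' exact definition:

    ```python
    def moving_quantile(series, window, q):
        """q-th sample quantile over a trailing window (nearest-rank rule)."""
        out = []
        for i in range(window - 1, len(series)):
            w = sorted(series[i - window + 1:i + 1])
            k = min(int(q * window), window - 1)
            out.append(w[k])
        return out

    x = [3, 1, 4, 1, 5, 9, 2, 6]
    med = moving_quantile(x, window=3, q=0.5)   # moving sample median
    ```

    In the models of the paper, terms like this moving quantile enter the autoregressive data generating process as additional regressors.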

  4. Filtering with the Centered Moving Median to Effectively Monitor Solution Processes for Safeguard Purposes

    Energy Technology Data Exchange (ETDEWEB)

    Richir, Patrice; Dzbikowicz, Zdzislaw [Institute for Transuranium Elements (ITU), Joint Research Centre (JRC), European Commission, Ispra, Varese (Italy)

    2012-06-15

    Reprocessing plants require continuous and integrated safeguards activities by inspectors of the IAEA and Euratom because, as complex facilities handling large quantities of direct-use nuclear material, they are proliferation-sensitive. In support of both organizations, the JRC has developed a solution monitoring software package (DAI, Data Analysis and Interpretation), which has been implemented in the main commercial European reprocessing plants and allows enhanced monitoring of nuclear materials in the processed solutions. This tool treats data acquired from different sensor types (e.g. from pressure transducers monitoring the solution levels in tanks). Collected signals are often noisy because of the instrumentation itself and/or ambient and operational conditions (e.g. pumps, ventilation systems or electromagnetic interference) and therefore require filtering. Filtering means reduction of information and has to be applied correctly to avoid misinterpretation of the process steps. This paper describes the study of several filters, in particular the centered moving median, which has proven to be a powerful tool for solution monitoring.
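    A minimal version of the centered moving median, with truncated windows at the edges (an illustration, not the DAI implementation), shows why it suppresses isolated spikes that a moving mean would smear:

    ```python
    from statistics import median

    def centered_moving_median(signal, half_width):
        """Centered moving median with window 2*half_width + 1;
        edges use the truncated window that fits."""
        out = []
        for i in range(len(signal)):
            lo = max(0, i - half_width)
            hi = min(len(signal), i + half_width + 1)
            out.append(median(signal[lo:hi]))
        return out

    # A level signal with a single noise spike: the median removes the
    # spike entirely, preserving genuine level changes in tank data.
    level = [10.0] * 5 + [50.0] + [10.0] * 5
    filtered = centered_moving_median(level, half_width=2)
    ```

    Because the median discards outliers rather than averaging them in, step changes in tank level (the safeguards-relevant events) keep their sharp edges after filtering.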

  5. Depth extraction method with high accuracy in integral imaging based on moving array lenslet technique

    Science.gov (United States)

    Wang, Yao-yao; Zhang, Juan; Zhao, Xue-wei; Song, Li-pei; Zhang, Bo; Zhao, Xing

    2018-03-01

    In order to improve depth extraction accuracy, a method using the moving array lenslet technique (MALT) in the pickup stage is proposed, which decreases the depth interval caused by pixelation. In this method, the lenslet array is moved along the horizontal and vertical directions simultaneously N times within one pitch to obtain N sets of elemental images. A computational integral imaging reconstruction method for MALT is used to obtain slice images of the 3D scene, and the sum modulus difference (SMD) blur metric is computed on these slice images to extract the depth information of the 3D scene. Simulation and optical experiments are carried out to verify the feasibility of this method.
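    A sum-modulus-difference style focus metric can be sketched as below; the exact metric used in the paper may differ, and the two toy "slices" are invented:

    ```python
    def smd(image):
        """Sum of absolute differences between horizontally and vertically
        adjacent pixels: larger for sharper (in-focus) slices."""
        h, w = len(image), len(image[0])
        s = 0.0
        for y in range(h - 1):
            for x in range(w - 1):
                s += abs(image[y][x] - image[y][x + 1])
                s += abs(image[y][x] - image[y + 1][x])
        return s

    sharp  = [[0, 255, 0], [255, 0, 255], [0, 255, 0]]      # high contrast
    blurry = [[96, 128, 96], [128, 96, 128], [96, 128, 96]]  # low contrast
    in_focus = smd(sharp) > smd(blurry)
    ```

    Scanning this metric across the reconstructed slice images and picking the depth with the maximum response is the essence of the depth extraction step.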

  6. Simple Moving Voltage Average Incremental Conductance MPPT Technique with Direct Control Method under Nonuniform Solar Irradiance Conditions

    Directory of Open Access Journals (Sweden)

    Amjad Ali

    2015-01-01

    Full Text Available A new simple moving voltage average (SMVA) technique with a fixed-step direct control incremental conductance method is introduced to reduce solar photovoltaic voltage (VPV) oscillation under nonuniform solar irradiation conditions. To evaluate and validate the performance of the proposed SMVA method against the conventional fixed-step direct control incremental conductance method under extreme conditions, different scenarios were simulated. Simulation results show that in most cases SMVA gives better results, with more stability, than traditional fixed-step direct control INC: a faster tracking system, reduced sustained oscillations, fast steady-state response, and robustness. The steady-state oscillations are almost eliminated because dP/dV is extremely small around the maximum power (MP) point, which verifies that the proposed method is suitable for standalone PV systems under extreme weather conditions, not only in terms of bus voltage stability but also in overall system efficiency.
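    A sketch of the idea, assuming the simplest possible forms (the paper's controller details are not reproduced): the voltage fed to a fixed-step incremental-conductance decision is first smoothed by a simple moving average. All constants are hypothetical:

    ```python
    def smva(voltages, window):
        """Simple moving voltage average over the last `window` samples."""
        w = voltages[-window:]
        return sum(w) / len(w)

    def inc_cond_step(v_avg, v_prev, i_now, i_prev, step=0.5):
        """Fixed-step incremental-conductance decision on the smoothed
        voltage: returns the perturbation direction for the operating point."""
        dv, di = v_avg - v_prev, i_now - i_prev
        if dv == 0:
            return 0.0 if di == 0 else (step if di > 0 else -step)
        # At the maximum power point, dI/dV = -I/V; move toward it otherwise.
        g = di / dv + i_now / v_avg
        if abs(g) < 1e-6:
            return 0.0
        return step if g > 0 else -step

    v_avg = smva([30.0, 30.5, 31.0], 3)            # smoothed PV voltage
    delta = inc_cond_step(v_avg, 30.0, 5.0, 4.8)   # direction of next step
    ```

    Smoothing the voltage before the conductance test is what damps the oscillations that irradiance transients would otherwise inject into the tracking loop.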

  7. Averaging processes in granular flows driven by gravity

    Science.gov (United States)

    Rossi, Giulia; Armanini, Aronne

    2016-04-01

    One of the more promising theoretical frameworks for analysing two-phase granular flows is offered by the similarity of their rheology with the kinetic theory of gases [1]. Granular flows can be considered a macroscopic equivalent of the molecular case: collisions among grains at the macroscopic scale are compared to collisions among molecules [2,3]. However, there are important statistical differences between the two applications. In two-phase fluid mechanics, there are two main types of average: the phasic average and the mass-weighted average [4]. The kinetic theories assume that atoms are so small that the number of molecules in a control volume is infinite. Under this assumption, the concentration (number of particles n) does not change during the averaging process and the two definitions of average coincide. This hypothesis is no longer true in granular flows: contrary to gases, the dimension of a single particle becomes comparable to that of the control volume. For this reason, in a single realization the number of grains is constant and the two averages coincide; on the contrary, over more than one realization, n is no longer constant and the two types of average lead to different results. Therefore, the ensemble average used in the standard kinetic theory (usually the phasic average) is suitable for a single realization, but not for several realizations, as already pointed out in [5,6]. In the literature, three main length scales have been identified [7]: the smallest is the particle size, the intermediate consists in the local averaging (in order to describe some instability phenomena or secondary circulation) and the largest arises from phenomena such as large eddies in turbulence. Our aim is to resolve the intermediate scale by applying the mass-weighted average when dealing with more than one realization. This statistical approach leads to additional diffusive terms in the continuity equation: starting from experimental
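    The distinction between the two averages is easy to see numerically: with two realizations holding different numbers of grains in the control volume (toy numbers below), the phasic (plain) average and the mass-weighted average of velocity disagree:

    ```python
    def phasic_average(conc, vel):
        """Ensemble (phasic) average of velocity: plain mean over realizations."""
        return sum(vel) / len(vel)

    def mass_weighted_average(conc, vel):
        """Mass-weighted (Favre-type) average: each realization is weighted
        by its particle concentration."""
        return sum(c * v for c, v in zip(conc, vel)) / sum(conc)

    # Two realizations with different grain counts in the control volume:
    conc = [10, 30]     # number of grains per realization
    vel  = [2.0, 1.0]   # grain velocity per realization
    pa = phasic_average(conc, vel)          # ignores the varying n
    ma = mass_weighted_average(conc, vel)   # accounts for it
    ```

    If `conc` were constant across realizations (the kinetic-theory idealization), the two functions would return the same value; the gap between them is exactly the effect the abstract attributes to finite particle size.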

  8. The Integrated Renovation Process

    DEFF Research Database (Denmark)

    Galiotto, Nicolas; Heiselberg, Per; Knudstrup, Mary-Ann

    The Integrated Renovation Process (IRP) is a user customized methodology based on judiciously selected constructivist and interactive multi-criteria decision making methods (Galiotto, Heiselberg, & Knudstrup, 2014 (expected)). When applied for home renovation, the Integrated Renovation Process...

  9. The dynamics of stochastic processes

    DEFF Research Database (Denmark)

    Basse-O'Connor, Andreas

    In the present thesis the dynamics of stochastic processes is studied with special attention to the semimartingale property. This is mainly motivated by the fact that semimartingales provide the class of processes for which it is possible to define a reasonable stochastic calculus, due to the Bichteler-Dellacherie Theorem. The semimartingale property of Gaussian processes is characterized in terms of their covariance function, spectral measure and spectral representation. In addition, representation and expansion of filtration results are provided as well. Special attention is given to moving average processes, and when the driving process is a Lévy or a chaos process the semimartingale property is characterized in the filtration spanned by the driving process and in the natural filtration when the latter is a Brownian motion. To obtain some of the above results an integrability of seminorm

  10. Books average previous decade of economic misery.

    Science.gov (United States)

    Bentley, R Alexander; Acerbi, Alberto; Ormerod, Paul; Lampos, Vasileios

    2014-01-01

    For the 20th century since the Depression, we find a strong correlation between a 'literary misery index' derived from English language books and a moving average of the previous decade of the annual U.S. economic misery index, which is the sum of inflation and unemployment rates. We find a peak in the goodness of fit at 11 years for the moving average. The fit between the two misery indices holds when using different techniques to measure the literary misery index, and this fit is significantly better than other possible correlations with different emotion indices. To check the robustness of the results, we also analysed books written in German language and obtained very similar correlations with the German economic misery index. The results suggest that millions of books published every year average the authors' shared economic experiences over the past decade.
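    The two building blocks of the analysis, a trailing decade average and a correlation with another series, can be sketched as follows (toy data, not the actual misery indices):

    ```python
    def trailing_mean(series, window):
        """Mean of the previous `window` values, not including the current one."""
        out = []
        for i in range(window, len(series)):
            out.append(sum(series[i - window:i]) / window)
        return out

    def pearson(x, y):
        """Pearson correlation coefficient of two equal-length series."""
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
        sxx = sum((a - mx) ** 2 for a in x)
        syy = sum((b - my) ** 2 for b in y)
        return sxy / (sxx * syy) ** 0.5

    # Toy series in which the second index tracks the previous decade's average.
    econ = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14]
    lit = trailing_mean(econ, 10)   # decade averages, aligned to years 10..13
    r = pearson(lit, econ[10:])     # correlation over the overlapping years
    ```

    The study's result amounts to sweeping the window length and finding the correlation peak at 11 years.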

  11. Deweyan integration: moving beyond place attachment in elderly migration theory.

    Science.gov (United States)

    Cutchin, M P

    2001-01-01

    The fact that aging-in-place and elderly migration are intricately linked has been overlooked by behavioral approaches to elderly migration. "Humanistic" inquiry has provided important insights into aging-in-place and elderly migration as well as the connection between the two. Humanistic approaches, however, do not encapsulate the full range of experience involved in elders' lives. To move beyond humanistic research and key concepts such as place attachment, the philosophy of John Dewey is introduced. Dewey's viewpoint is merged with the geographical concept of place into what is termed "place integration." This perspective is subsequently compared with humanistic perspectives on aging-in-place and elderly migration decision-making. Fundamental differences such as temporal orientation and substantive focus are illustrated and discussed. Conclusions address the utility of such a perspective.

  12. The Prediction of Exchange Rates with the Use of Auto-Regressive Integrated Moving-Average Models

    Directory of Open Access Journals (Sweden)

    Daniela Spiesová

    2014-10-01

    Full Text Available The currency market is currently the largest market in the world, and over its existence many theories have been proposed for predicting the development of exchange rates based on macroeconomic, microeconomic, statistical and other models. The aim of this paper is to identify an adequate model for predicting non-stationary time series of exchange rates and then to use this model to predict the trend of development of European currencies against the euro. The uniqueness of this paper lies in the fact that, while many expert studies deal with predicting the exchange rates of the American dollar against other currencies, only a limited number of scientific studies are concerned with long-term prediction of European currencies using integrated ARMA models, even though the development of exchange rates has a crucial impact on all levels of the economy, and its prediction is an important indicator for individual countries, banks, companies and businessmen as well as for investors. The results of this study confirm that, to predict the conditional variance and then estimate the future values of exchange rates, it is adequate to use the ARIMA(1,1,1) model without a constant, or the ARIMA[(1,7),1,(1,7)] model, where in the long term the square root of the conditional variance inclines towards a stable value.
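    The "I" in ARIMA is first-differencing. A minimal sketch of differencing a non-stationary series, fitting an AR(1) on the differences, and integrating the forecast back (the rate values are invented, and this is far simpler than the paper's full ARIMA(1,1,1) estimation):

    ```python
    def difference(series):
        """First differences: turns a trending series into a stationary one."""
        return [b - a for a, b in zip(series, series[1:])]

    def fit_ar1(diffs):
        """Least-squares AR(1) coefficient on the (zero-mean-ish) differences."""
        num = sum(diffs[t] * diffs[t - 1] for t in range(1, len(diffs)))
        den = sum(d * d for d in diffs[:-1])
        return num / den if den else 0.0

    # Hypothetical daily exchange-rate levels (non-stationary, drifting up).
    rates = [25.0, 25.2, 25.5, 25.6, 25.9, 26.1, 26.4]
    d = difference(rates)
    phi = fit_ar1(d)
    next_diff = phi * d[-1]
    forecast = rates[-1] + next_diff   # integrate the differenced forecast back
    ```

    A full ARIMA fit would add an MA term and estimate all parameters jointly by maximum likelihood, but the difference-model-integrate cycle above is the structural core.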

  13. An Operator-Integration-Factor Splitting (OIFS) method for Incompressible Flows in Moving Domains

    Energy Technology Data Exchange (ETDEWEB)

    Patel, Saumil S. [Argonne National Lab. (ANL), Argonne, IL (United States); Fischer, Paul F. [Argonne National Lab. (ANL), Argonne, IL (United States); Univ. of Illinois, Urbana-Champaign, IL (United States); Min, Misun [Argonne National Lab. (ANL), Argonne, IL (United States); Tomboulides, Ananias G [Argonne National Lab. (ANL), Argonne, IL (United States); Aristotle Univ., Thessaloniki (Greece)

    2017-10-21

    In this paper, we present a characteristic-based numerical procedure for simulating incompressible flows in domains with moving boundaries. Our approach utilizes an operator-integration-factor splitting technique to help produce an efficient and stable numerical scheme. Using the spectral element method and an arbitrary Lagrangian-Eulerian formulation, we investigate flows where the convective acceleration effects are non-negligible. Several examples, ranging from laminar to turbulent flows, are considered. Comparisons with a standard, semi-implicit time-stepping procedure illustrate the improved performance of the scheme.

  14. Books Average Previous Decade of Economic Misery

    Science.gov (United States)

    Bentley, R. Alexander; Acerbi, Alberto; Ormerod, Paul; Lampos, Vasileios

    2014-01-01

    For the 20th century since the Depression, we find a strong correlation between a ‘literary misery index’ derived from English language books and a moving average of the previous decade of the annual U.S. economic misery index, which is the sum of inflation and unemployment rates. We find a peak in the goodness of fit at 11 years for the moving average. The fit between the two misery indices holds when using different techniques to measure the literary misery index, and this fit is significantly better than other possible correlations with different emotion indices. To check the robustness of the results, we also analysed books written in German language and obtained very similar correlations with the German economic misery index. The results suggest that millions of books published every year average the authors' shared economic experiences over the past decade. PMID:24416159

  15. Managing the construction bidding process : a move to simpler construction plan sets

    Science.gov (United States)

    2001-01-31

    This project was conducted to determine whether construction plan sets could be significantly simplified to speed the process of moving projects to construction. The work steps included a literature review, a telephone survey of highway agencies in s...

  16. Plasma separation process: Magnet move to Oak Ridge National Laboratory

    International Nuclear Information System (INIS)

    1989-07-01

This is the final report on the series of operations which culminated with the delivery of the Plasma Separation Process prototype magnet system (PMS) to Building K1432 at Oak Ridge National Laboratory (ORNL). The procedure included real-time monitoring of the cold mass support strut strain gauges and an in-cab rider to monitor the instrumentation and direct the driver. The primary technical consideration for these precautions was the possibility of low-frequency resonant vibration of the cold mass, excited by symmetrical rough road conditions at specific speeds, causing excess stress levels in the support struts and consequent strut failure. A secondary consideration was the possibility of high acceleration loads due to sudden stops, severe road conditions, or impacts. The procedure for moving and transportation to ORNL included requirements for real-time continuous monitoring of the eight strut strain gauges and three external accelerometers. Because the strain gauges had not been used since the original magnet cooldown, it was planned to verify their integrity during magnet warmup. The measurements made from the strut strain gauges resulted in stress values that were physically impossible. It was concluded that further evaluation was necessary to verify the usefulness of these gauges and whether they might be faulty. This was accomplished during the removal of the magnet from the building. 6 figs., 1 tab

  17. An Estimation of the Likelihood of Significant Eruptions During 2000-2009 Using Poisson Statistics on Two-Point Moving Averages of the Volcanic Time Series

    Science.gov (United States)

    Wilson, Robert M.

    2001-01-01

    Since 1750, the number of cataclysmic volcanic eruptions (volcanic explosivity index (VEI)>=4) per decade spans 2-11, with 96 percent located in the tropics and extra-tropical Northern Hemisphere. A two-point moving average of the volcanic time series has higher values since the 1860's than before, being 8.00 in the 1910's (the highest value) and 6.50 in the 1980's, the highest since the 1910's peak. Because of the usual behavior of the first difference of the two-point moving averages, one infers that its value for the 1990's will measure approximately 6.50 +/- 1, implying that approximately 7 +/- 4 cataclysmic volcanic eruptions should be expected during the present decade (2000-2009). Because cataclysmic volcanic eruptions (especially those having VEI>=5) nearly always have been associated with short-term episodes of global cooling, the occurrence of even one might confuse our ability to assess the effects of global warming. Poisson probability distributions reveal that the probability of one or more events with a VEI>=4 within the next ten years is >99 percent. It is approximately 49 percent for an event with a VEI>=5, and 18 percent for an event with a VEI>=6. Hence, the likelihood that a climatically significant volcanic eruption will occur within the next ten years appears reasonably high.
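The Poisson part of the argument is simple to reproduce: for a Poisson process with decadal rate lambda, the probability of at least one event in a decade is 1 - exp(-lambda). The rates below are assumptions chosen to reproduce the abstract's quoted probabilities, not values taken from the paper.

```python
# Sketch: P(N >= 1) = 1 - exp(-lam) for a Poisson count with mean lam,
# plus the two-point moving average used to smooth the decadal series.
# The rate values are illustrative assumptions.
import math

def p_at_least_one(lam):
    """Probability of one or more events in a decade with mean rate lam."""
    return 1.0 - math.exp(-lam)

def two_point_ma(xs):
    """Two-point moving average of a decadal eruption-count series."""
    return [(a + b) / 2 for a, b in zip(xs, xs[1:])]

rates = {"VEI>=4": 7.0, "VEI>=5": 0.67, "VEI>=6": 0.20}
for label, lam in rates.items():
    print(label, round(p_at_least_one(lam), 2))
```

With these assumed rates the computed probabilities come out near the quoted >99 percent, 49 percent and 18 percent.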

  18. Integrated knowledge translation: digging deeper, moving forward.

    Science.gov (United States)

    Kothari, Anita; Wathen, C Nadine

    2017-06-01

Integrated knowledge translation has risen in popularity as a solution to the underuse of research in policy and practice settings. It engages knowledge users (policymakers, practitioners, patients/consumers or their advocates, and members of the wider public) in mutually beneficial research that can involve the joint development of research questions, data collection, analysis and dissemination of findings. Knowledge that is co-produced has a better chance of being implemented. The purpose of this paper is to update developments in the field of integrated knowledge translation through a deeper analysis of the approach in practice-oriented and policy-oriented health research. We present collaborative models that fall outside the scope of integrated knowledge translation, but then explore consensus-based approaches and networks as alternate sites of knowledge co-production. We discuss the need to advance the field through the development, or use, of data collection and interpretation tools that creatively engage knowledge users in the research process. Most importantly, conceptually relevant outcomes need to be identified, including ones that focus on team transformation through the co-production of knowledge. We explore some of these challenges and benefits in detail to help researchers understand what integrated knowledge translation means, and whether the approach's potential added value is worth the investment of time, energy and other resources.

  19. A One Line Derivation of DCC: Application of a Vector Random Coefficient Moving Average Process

    NARCIS (Netherlands)

    C.M. Hafner (Christian); M.J. McAleer (Michael)

    2014-01-01

One of the most widely-used multivariate conditional volatility models is the dynamic conditional correlation (or DCC) specification. However, the underlying stochastic process to derive DCC has not yet been established, which has made problematic the derivation of

  20. Integrable discretizations and self-adaptive moving mesh method for a coupled short pulse equation

    International Nuclear Information System (INIS)

    Feng, Bao-Feng; Chen, Junchao; Chen, Yong; Maruno, Ken-ichi; Ohta, Yasuhiro

    2015-01-01

    In the present paper, integrable semi-discrete and fully discrete analogues of a coupled short pulse (CSP) equation are constructed. The key to the construction are the bilinear forms and determinant structure of the solutions of the CSP equation. We also construct N-soliton solutions for the semi-discrete and fully discrete analogues of the CSP equations in the form of Casorati determinants. In the continuous limit, we show that the fully discrete CSP equation converges to the semi-discrete CSP equation, then further to the continuous CSP equation. Moreover, the integrable semi-discretization of the CSP equation is used as a self-adaptive moving mesh method for numerical simulations. The numerical results agree with the analytical results very well. (paper)

  1. Features of measurement and processing of vibration signals registered on the moving parts of electrical machines

    OpenAIRE

    Gyzhko, Yuri

    2011-01-01

    Measurement and processing of vibration signals registered on the moving parts of the electrical machines using the diagnostic information-measuring system that uses Bluetooth wireless standard for the transmission of the measured data from moving parts of electrical machine is discussed.

  2. Matrix product approach for the asymmetric random average process

    International Nuclear Information System (INIS)

    Zielen, F; Schadschneider, A

    2003-01-01

    We consider the asymmetric random average process which is a one-dimensional stochastic lattice model with nearest-neighbour interaction but continuous and unbounded state variables. First, the explicit functional representations, so-called beta densities, of all local interactions leading to steady states of product measure form are rigorously derived. This also completes an outstanding proof given in a previous publication. Then we present an alternative solution for the processes with factorized stationary states by using a matrix product ansatz. Due to continuous state variables we obtain a matrix algebra in the form of a functional equation which can be solved exactly

  3. Asymptotic Time Averages and Frequency Distributions

    Directory of Open Access Journals (Sweden)

    Muhammad El-Taha

    2016-01-01

Consider an arbitrary nonnegative deterministic process (in a stochastic setting, {X(t), t ≥ 0} is a fixed realization, i.e., a sample path of the underlying stochastic process) with state space S = (-∞, ∞). Using a sample-path approach, we give necessary and sufficient conditions for the long-run time average of a measurable function of the process to be equal to the expectation taken with respect to the same measurable function of its long-run frequency distribution. The results are further extended to allow an unrestricted parameter (time) space. Examples are provided to show that our condition is not superfluous and that it is weaker than uniform integrability. The case of discrete-time processes is also considered. The relationship to previously known sufficient conditions, usually given in stochastic settings, is also discussed. Our approach is applied to regenerative processes and an extension of a well-known result is given. For researchers interested in sample-path analysis, our results give them the choice to work with the time average of a process or its frequency distribution function and to go back and forth between the two under a mild condition.
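In the notation sketched above (which is assumed here, not quoted from the paper), the equality in question relates a long-run time average to an expectation under the long-run frequency distribution F of the process:

\[
\lim_{t\to\infty} \frac{1}{t}\int_0^t f\big(X(s)\big)\,ds
\;=\; \int_S f(x)\,dF(x),
\]

where f is a measurable function on the state space S. The paper's contribution is a necessary and sufficient sample-path condition for this identity, weaker than uniform integrability.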

  4. A new mathematical process for the calculation of average forms of teeth.

    Science.gov (United States)

    Mehl, A; Blanz, V; Hickel, R

    2005-12-01

    Qualitative visual inspections and linear metric measurements have been predominant methods for describing the morphology of teeth. No quantitative formulation exists for the description of dental features. The aim of this study was to determine and validate a mathematical process for calculation of the average form of first maxillary molars, including the general occlusal features. Stone replicas of 174 caries-free first maxillary molar crowns from young patients ranging from 6 to 9 years of age were measured 3-dimensionally with a laser scanning system at a resolution of approximately 100,000 points. Then, the average tooth was computed, which captured the common features of the molar's surface quantitatively. This new method adapts algorithms both from computer science and neuroscience to detect and associate the same features and same surface points (correspondences) between 1 reference tooth and all other teeth. In this study, the method was tested for 7 different reference teeth. The algorithm does not involve any prior knowledge about teeth and their features. Irrespective of the reference tooth used, the procedure yielded average teeth that showed nearly no differences (less than +/-30 microm). This approach provides a valid quantitative process for calculating 3-dimensional (3D) averages of occlusal surfaces of teeth even in the event of a high number of digitized surface points. Additionally, because this process detects and assigns point-wise feature correspondences between all library teeth, it may also serve as a basis for a more substantiated principal component analysis evaluating the main natural shape deviations from the 3D average.
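Once point-wise correspondences between teeth have been established, the final averaging step is just a point-wise mean over the corresponded surface points. The sketch below illustrates only that last step with toy pre-aligned data; the correspondence-detection algorithm, which is the substance of the paper, is not reproduced here.

```python
# Hedged sketch of the averaging step: the "average tooth" is the
# point-wise mean of corresponded surface points across all scans.
# The scans below are tiny pre-aligned toy point lists, not real data.

def average_surface(teeth):
    """teeth: list of scans, each a list of corresponded (x, y, z) points."""
    n = len(teeth)
    return [tuple(sum(scan[i][k] for scan in teeth) / n for k in range(3))
            for i in range(len(teeth[0]))]

scan_a = [(0.0, 0.0, 1.0), (1.0, 0.0, 2.0)]
scan_b = [(0.2, 0.0, 1.2), (1.2, 0.0, 2.2)]
print(average_surface([scan_a, scan_b]))
```

Because the averaging is done over corresponding feature points rather than raw coordinates, occlusal features are preserved instead of being smeared out.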

  5. Biochemical Process Development and Integration | Bioenergy | NREL

    Science.gov (United States)

NREL's biochemical process development and integration work spans conversion and separation processes through pilot-scale integrated process development and scale-up.

  6. Population-averaged macaque brain atlas with high-resolution ex vivo DTI integrated into in vivo space.

    Science.gov (United States)

    Feng, Lei; Jeon, Tina; Yu, Qiaowen; Ouyang, Minhui; Peng, Qinmu; Mishra, Virendra; Pletikos, Mihovil; Sestan, Nenad; Miller, Michael I; Mori, Susumu; Hsiao, Steven; Liu, Shuwei; Huang, Hao

    2017-12-01

Animal models of the rhesus macaque (Macaca mulatta), the most widely used nonhuman primate, have been irreplaceable in neurobiological studies. However, a population-averaged macaque brain diffusion tensor imaging (DTI) atlas, including comprehensive gray and white matter labeling as well as bony and facial landmarks guiding invasive experimental procedures, is not available. The macaque white matter tract pathways and microstructures have rarely been recorded. Here, we established a population-averaged macaque brain atlas with high-resolution ex vivo DTI integrated into in vivo space, incorporating bony and facial landmarks, and delineated the microstructures and three-dimensional pathways of major white matter tracts. In vivo MRI/DTI and ex vivo (postmortem) DTI of ten rhesus macaque brains were acquired. A single-subject macaque brain DTI template was obtained by transforming the postmortem high-resolution DTI data into in vivo space. Ex vivo DTI of the ten macaque brains was then averaged in the in vivo single-subject template space to generate the population-averaged macaque brain DTI atlas. The white matter tracts were traced with DTI-based tractography. One hundred and eighteen neural structures, including all cortical gyri, white matter tracts and subcortical nuclei, were labeled manually on population-averaged DTI-derived maps. The in vivo microstructural metrics of fractional anisotropy, axial, radial and mean diffusivity of the traced white matter tracts were measured. The population-averaged digital atlas integrated into in vivo space can be used to label the experimental macaque brain automatically. Bony and facial landmarks will be available for guiding invasive procedures. The DTI metric measurements offer unique insights into heterogeneous microstructural profiles of different white matter tracts.

  7. The LEAN Payload Integration Process

    Science.gov (United States)

    Jordan, Lee P.; Young, Yancy; Rice, Amanda

    2011-01-01

    It is recognized that payload development and integration with the International Space Station (ISS) can be complex. This streamlined integration approach is a first step toward simplifying payload integration; making it easier to fly payloads on ISS, thereby increasing feasibility and interest for more research and commercial organizations to sponsor ISS payloads and take advantage of the ISS as a National Laboratory asset. The streamlined integration approach was addressed from the perspective of highly likely initial payload types to evolve from the National Lab Pathfinder program. Payloads to be accommodated by the Expedite the Processing of Experiments for Space Station (EXPRESS) Racks and Microgravity Sciences Glovebox (MSG) pressurized facilities have been addressed. It is hoped that the streamlined principles applied to these types of payloads will be analyzed and implemented in the future for other host facilities as well as unpressurized payloads to be accommodated by the EXPRESS Logistics Carrier (ELC). Further, a payload does not have to be classified as a National Lab payload in order to be processed according to the lean payload integration process; any payload that meets certain criteria can follow the lean payload integration process.

  8. Integration of a neuroimaging processing pipeline into a pan-canadian computing grid

    International Nuclear Information System (INIS)

    Lavoie-Courchesne, S; Chouinard-Decorte, F; Doyon, J; Bellec, P; Rioux, P; Sherif, T; Rousseau, M-E; Das, S; Adalat, R; Evans, A C; Craddock, C; Margulies, D; Chu, C; Lyttelton, O

    2012-01-01

    The ethos of the neuroimaging field is quickly moving towards the open sharing of resources, including both imaging databases and processing tools. As a neuroimaging database represents a large volume of datasets and as neuroimaging processing pipelines are composed of heterogeneous, computationally intensive tools, such open sharing raises specific computational challenges. This motivates the design of novel dedicated computing infrastructures. This paper describes an interface between PSOM, a code-oriented pipeline development framework, and CBRAIN, a web-oriented platform for grid computing. This interface was used to integrate a PSOM-compliant pipeline for preprocessing of structural and functional magnetic resonance imaging into CBRAIN. We further tested the capacity of our infrastructure to handle a real large-scale project. A neuroimaging database including close to 1000 subjects was preprocessed using our interface and publicly released to help the participants of the ADHD-200 international competition. This successful experiment demonstrated that our integrated grid-computing platform is a powerful solution for high-throughput pipeline analysis in the field of neuroimaging.

  9. Human-Systems Integration Processes

    Data.gov (United States)

    National Aeronautics and Space Administration — The goal of this project is to baseline a Human-Systems Integration Processes (HSIP) document as a companion to the NASA-STD-3001 and Human Integration Design...

  10. Analyzing inflation in Nigeria: a fractionally integrated ARFIMA ...

    African Journals Online (AJOL)

    The study looked into the stochastic properties of CPI-inflation rate for Nigeria from 1995Q1 to 2016Q4. The study employed an autoregressive fractionally integrated moving average and a general autoregressive conditional heteroskedasticity (ARFIMA-GARCH) methodology as well as ADF/KPSS to investigate the ...
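The "fractionally integrated" part of ARFIMA replaces integer differencing with the fractional filter (1 - B)^d, whose binomial expansion gives a sequence of weights applied to lagged observations. The sketch below computes those weights; d = 0.3 is an arbitrary illustrative value, not an estimate for the Nigerian series.

```python
# Hedged sketch: weights of the fractional-differencing filter (1 - B)^d,
# via the standard recursion pi_0 = 1, pi_k = pi_{k-1} * (k - 1 - d) / k.
# d = 0.3 is an assumed illustrative value.

def frac_diff_weights(d, n):
    """First n weights of the (1 - B)^d expansion."""
    w = [1.0]
    for k in range(1, n):
        w.append(w[-1] * (k - 1 - d) / k)
    return w

print([round(x, 4) for x in frac_diff_weights(0.3, 5)])
```

For 0 < d < 0.5 the weights decay hyperbolically rather than geometrically, which is what lets ARFIMA capture the long-memory behaviour that distinguishes it from plain ARIMA.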

  11. MOTION ARTIFACT REDUCTION IN FUNCTIONAL NEAR INFRARED SPECTROSCOPY SIGNALS BY AUTOREGRESSIVE MOVING AVERAGE MODELING BASED KALMAN FILTERING

    Directory of Open Access Journals (Sweden)

    MEHDI AMIAN

    2013-10-01

Functional near infrared spectroscopy (fNIRS) is a technique used for noninvasive measurement of the oxyhemoglobin (HbO2) and deoxyhemoglobin (HHb) concentrations in brain tissue. Since the ratio of the concentrations of these two agents is correlated with neuronal activity, fNIRS can be used for monitoring and quantifying cortical activity. The portability of fNIRS makes it a good candidate for studies involving subject movement. The fNIRS measurements, however, are sensitive to artifacts generated by the subject's head motion, which makes fNIRS signals less effective in such applications. In this paper, autoregressive moving average (ARMA) modeling of the fNIRS signal is proposed for a state-space representation of the signal, which is then fed to a Kalman filter to estimate the motionless signal from the motion-corrupted signal. Results are compared to the previously reported autoregressive (AR) model based approach and show that the ARMA models outperform AR models. We attribute this to the richer structure of ARMA models, which contain more terms than AR models. We show that the signal-to-noise ratio (SNR) is about 2 dB higher for the ARMA-based method.
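The filtering idea can be illustrated with a deliberately simplified scalar Kalman filter whose state follows an AR(1) model. The paper itself uses a richer ARMA state-space; AR(1) is used here only to keep the example short, and the parameters phi, q, r are assumed values, not fitted ones.

```python
# Simplified sketch: scalar Kalman filter with an assumed AR(1) state
# model for the motion-free signal (the paper uses a full ARMA
# state-space; this is a reduced stand-in for illustration).

def kalman_ar1(ys, phi=0.9, q=0.01, r=1.0):
    """Filter observations ys; phi is the AR(1) coefficient,
    q the process-noise variance, r the measurement-noise variance."""
    x, p = 0.0, 1.0          # state estimate and its variance
    out = []
    for y in ys:
        # predict step
        x, p = phi * x, phi * phi * p + q
        # update step with the (possibly motion-corrupted) observation y
        k = p / (p + r)
        x, p = x + k * (y - x), (1 - k) * p
        out.append(x)
    return out

noisy = [1.0, 1.2, 0.8, 5.0, 1.1, 0.9]   # spike at index 3 mimics motion
print([round(v, 2) for v in kalman_ar1(noisy)])
```

The motion spike is strongly attenuated because the Kalman gain weights the observation against the model prediction; a better state model (ARMA rather than AR) yields better predictions and hence, as the paper reports, a higher SNR.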

  12. Adaptive memory: the survival scenario enhances item-specific processing relative to a moving scenario.

    Science.gov (United States)

    Burns, Daniel J; Hart, Joshua; Griffith, Samantha E; Burns, Amy D

    2013-01-01

    Nairne, Thompson, and Pandeirada (2007) found that retention of words rated for their relevance to survival is superior to that of words encoded under numerous other deep processing conditions. They suggested that our memory systems might have evolved to confer an advantage for survival-relevant information. Burns, Burns, and Hwang (2011) suggested a two-process explanation of the proximate mechanisms responsible for the survival advantage. Whereas most control tasks encourage only one type of processing, the survival task encourages both item-specific and relational processing. They found that when control tasks encouraged both types of processing, the survival processing advantage was eliminated. However, none of their control conditions included non-survival scenarios (e.g., moving, vacation, etc.), so it is not clear how this two-process explanation would explain the survival advantage when scenarios are used as control conditions. The present experiments replicated the finding that the survival scenario improves recall relative to a moving scenario in both a between-lists and within-list design and also provided evidence that this difference was accompanied by an item-specific processing difference, not a difference in relational processing. The implications of these results for several existing accounts of the survival processing effect are discussed.

  13. Modeling an Application's Theoretical Minimum and Average Transactional Response Times

    Energy Technology Data Exchange (ETDEWEB)

    Paiz, Mary Rose [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-04-01

The theoretical minimum transactional response time of an application serves as a basis for the expected response time. The lower threshold for the minimum response time represents the minimum amount of time that the application should take to complete a transaction. Knowing the lower threshold is beneficial in detecting anomalies that are results of unsuccessful transactions. Conversely, when an application's response time falls above an upper threshold, there is likely an anomaly in the application that is causing unusual performance issues in the transaction. This report explains how the non-stationary Generalized Extreme Value distribution is used to estimate the lower threshold of an application's daily minimum transactional response time. It also explains how the seasonal Autoregressive Integrated Moving Average time series model is used to estimate the upper threshold for an application's average transactional response time.
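The thresholding pipeline starts by reducing raw response times to daily block minima. The sketch below shows that reduction and then uses a low empirical quantile of the minima as a crude stand-in for the lower threshold; the report fits a non-stationary GEV distribution to the minima (and a seasonal ARIMA to the averages), which this simplification does not attempt.

```python
# Hedged sketch: block-minima extraction plus an empirical-quantile
# lower threshold. This is a simplified stand-in for the report's
# non-stationary GEV fit; all data below are made up.

def daily_minima(response_times, per_day):
    """Minimum response time per day, given per_day samples each day."""
    return [min(response_times[i:i + per_day])
            for i in range(0, len(response_times), per_day)]

def lower_threshold(minima, quantile=0.05):
    """Empirical low quantile of the daily minima."""
    ordered = sorted(minima)
    idx = max(0, int(quantile * len(ordered)) - 1)
    return ordered[idx]

rts = [0.21, 0.35, 0.19, 0.40, 0.22, 0.18, 0.30, 0.25,
       0.20, 0.33, 0.24, 0.19, 0.28, 0.21, 0.26, 0.23]
mins = daily_minima(rts, 4)
print(mins, lower_threshold(mins))
```

Extreme-value theory justifies fitting a GEV distribution to such block minima; the non-stationary variant additionally lets the GEV location parameter drift over time, which an empirical quantile cannot capture.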

  14. Integration of motion energy from overlapping random background noise increases perceived speed of coherently moving stimuli.

    Science.gov (United States)

    Chuang, Jason; Ausloos, Emily C; Schwebach, Courtney A; Huang, Xin

    2016-12-01

The perception of visual motion can be profoundly influenced by visual context. To gain insight into how the visual system represents motion speed, we investigated how a background stimulus that did not move in a net direction influenced the perceived speed of a center stimulus. Visual stimuli were two overlapping random-dot patterns. The center stimulus moved coherently in a fixed direction, whereas the background stimulus moved randomly. We found that human subjects perceived the speed of the center stimulus to be significantly faster than its veridical speed when the background contained motion noise. Interestingly, the perceived speed was tuned to the noise level of the background. When the speed of the center stimulus was low, the highest perceived speed was reached when the background had a low level of motion noise. As the center speed increased, the peak perceived speed was reached at a progressively higher background noise level. The effect of speed overestimation required the center stimulus to overlap with the background. Increasing the background size within a certain range enhanced the effect, suggesting spatial integration. The speed overestimation was significantly reduced or abolished when the center stimulus and the background stimulus had different colors, or when they were placed at different depths. When the center and background stimuli were perceptually separable, speed overestimation was correlated with perceptual similarity between the center and background stimuli. These results suggest that integration of motion energy from random motion noise has a significant impact on speed perception. Our findings put new constraints on models regarding the neural basis of speed perception.

  15. Tetrafluoride uranium pilot plant in operation at IEA, using the moving bed process

    International Nuclear Information System (INIS)

    Franca Junior, J.M.

    1975-01-01

A UF4 pilot plant, in operation at IEA, using the moving bed process is reported. UO3 obtained from the thermal decomposition of ADU is used as a starting material in this pilot plant. The type of equipment and the process are both described. Ammonia gas (NH3) was used in the reduction operation and anhydrous hydrofluoric acid (HF) in the hydrofluorination step.

  16. The Integrated Renovation Process

    DEFF Research Database (Denmark)

    Galiotto, Nicolas

and constructivist multiple criteria decision-making analysis method is selected for developing the work further. The method is introduced and applied to the renovation of a multi-residential historic building. Furthermore, a new scheme, the Integrated Renovation Process, is presented. Finally, the methodology... is applied to two single-family homes. In practice, such a scheme allowed most informational barriers to sustainable home renovation to be overcome. The homeowners were better integrated and their preferences and immaterial values were better taken into account. They assimilated the multiple benefits... to keep the decision-making process economically viable and timely, the process still needs to be improved and new tools need to be developed...

  17. Integrated durability process in product development

    International Nuclear Information System (INIS)

    Pompetzki, M.; Saadetian, H.

    2002-01-01

    This presentation describes the integrated durability process in product development. Each of the major components of the integrated process are described along with a number of examples of how integrated durability assessment has been used in the ground vehicle industry. The durability process starts with the acquisition of loading information, either physically through loads measurement or virtually through multibody dynamics. The loading information is then processed and characterized for further analysis. Durability assessment was historically test based and completed through field or laboratory evaluation. Today, it is common that both the test and CAE environments are used together in durability assessment. Test based durability assessment is used for final design sign-off but is also critically important for correlating CAE models, in order to investigate design alternatives. There is also a major initiative today to integrate the individual components into a process, by linking applications and providing a framework to communicate information as well as manage all the data involved in the entire process. Although a single process is presented, the details of the process can vary significantly for different products and applications. Recent applications that highlight different parts of the durability process are given. As well as an example of how integration of software tools between different disciplines (MBD, FE and fatigue) not only simplifies the process, but also significantly improves it. (author)

  18. Establishment and assessment of an integrated citric acid-methane production process.

    Science.gov (United States)

    Xu, Jian; Chen, Yang-Qiu; Zhang, Hong-Jian; Bao, Jia-Wei; Tang, Lei; Wang, Ke; Zhang, Jian-Hua; Chen, Xu-Sheng; Mao, Zhong-Gui

    2015-01-01

To solve the problem of extraction wastewater in citric acid industrial production, an improved integrated citric acid-methane production process was established in this study. Extraction wastewater was treated by anaerobic digestion and then the anaerobic digestion effluent (ADE) was stripped by air to remove ammonia. After solid-liquid separation to remove metal-ion precipitates, the supernatant was recycled for the next batch of citric acid fermentation, thus eliminating wastewater discharge and reducing water consumption. 130 U/g glucoamylase was added to the medium after inoculation and the recycling process was performed for 10 batches. Fermentation time decreased by 20% during recycling and the average citric acid production (2nd-10th batches) was 145.9±3.4 g/L, only 2.5% lower than that with tap water (149.6 g/L). The average methane production was 292.3±25.1 mL/g CODremoved and stable in operation. Excessive Na(+) concentration in ADE was confirmed to be the major challenge for the proposed process.

  19. A framework about flow measurements by LDA–PDA as a spatio-temporal average: application to data post-processing

    International Nuclear Information System (INIS)

    Calvo, Esteban; García, Juan A; García, Ignacio; Aísa, Luis; Santolaya, José Luis

    2012-01-01

    method and the cross-section integral calibration method. Finally, a physical interpretation of the statistical reconstruction process is provided: it is a spatio-temporal averaging of the detected particle data, and some of the algorithms used are related to the Eulerian–Eulerian mathematical description of multiphase flows. (paper)

  20. A framework about flow measurements by LDA-PDA as a spatio-temporal average: application to data post-processing

    Science.gov (United States)

    Calvo, Esteban; García, Juan A.; Santolaya, José Luis; García, Ignacio; Aísa, Luis

    2012-05-01

    method and the cross-section integral calibration method. Finally, a physical interpretation of the statistical reconstruction process is provided: it is a spatio-temporal averaging of the detected particle data, and some of the algorithms used are related to the Eulerian-Eulerian mathematical description of multiphase flows.

  1. Developing engineering processes through integrated modelling of product and process

    DEFF Research Database (Denmark)

    Nielsen, Jeppe Bjerrum; Hvam, Lars

    2012-01-01

This article aims at developing an operational tool for integrated modelling of product assortments and engineering processes in companies making customer-specific products. Integrating a product model in the design of engineering processes will provide a deeper understanding of the engineering... activities as well as insight into how product features affect the engineering processes. The article suggests possible ways of integrating models of products with models of engineering processes. The models have been tested and further developed in an action research study carried out in collaboration... with a major international engineering company.

  2. Original article Functioning of memory and attention processes in children with intelligence below average

    Directory of Open Access Journals (Sweden)

    Aneta Rita Borkowska

    2014-05-01

BACKGROUND The aim of the research was to assess memorization and recall of logically connected and unconnected material, coded graphically and linguistically, and the ability to focus attention, in a group of children with intelligence below average, compared to children with average intelligence. PARTICIPANTS AND PROCEDURE The study group included 27 children with intelligence below average. The control group consisted of 29 individuals. All of them were examined using the authors' experimental trials and the TUS test (Attention and Perceptiveness Test). RESULTS Children with intelligence below average memorized significantly less information contained in the logical material, demonstrated lower ability to memorize the visual material, memorized significantly fewer words in the verbal material learning task, achieved lower results in such indicators of the visual attention process pace as the number of omissions and mistakes, and had a lower pace of perceptual work, compared to children with average intelligence. CONCLUSIONS The results confirm that children with intelligence below average have difficulties with memorizing new material, both logically connected and unconnected. The significantly lower capacity of direct memory is independent of modality. The results of the study on the memory process confirm the hypothesis about lower abilities of children with intelligence below average in terms of concentration, work pace, efficiency and perception.

  3. Averaging models: parameters estimation with the R-Average procedure

    Directory of Open Access Journals (Sweden)

    S. Noventa

    2010-01-01

    Full Text Available The Functional Measurement approach, proposed within the theoretical framework of Information Integration Theory (Anderson, 1981, 1982), can be a useful multi-attribute analysis tool. Compared to the majority of statistical models, the averaging model can account for interaction effects without adding complexity. The R-Average method (Vidotto & Vicentini, 2007) can be used to estimate the parameters of these models. By using multiple information criteria in the model selection procedure, R-Average allows for the identification of the best subset of parameters that accounts for the data. After a review of the general method, we present an implementation of the procedure in the framework of R-project, followed by some experiments using a Monte Carlo method.
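
    The averaging rule at the heart of these models can be sketched in a few lines. The function below is an illustrative stand-in, not the R-Average procedure itself; the initial-state parameters (s0, w0) follow Anderson's usual formulation, and all numbers are hypothetical.

```python
import numpy as np

def averaging_response(scales, weights, s0=0.0, w0=0.0):
    """Averaging-model response: a weighted mean of the attribute scale
    values, optionally including Anderson's initial state (s0, w0)."""
    scales = np.asarray(scales, dtype=float)
    weights = np.asarray(weights, dtype=float)
    return (w0 * s0 + np.sum(weights * scales)) / (w0 + np.sum(weights))
```

    Unlike an adding rule, appending an attribute whose value equals the current response leaves the response unchanged, which is how averaging models accommodate set-size effects without extra interaction terms.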

  4. Development of a higher-order finite volume method for simulation of thermal oil recovery process using moving mesh strategy

    Energy Technology Data Exchange (ETDEWEB)

    Ahmadi, M. [Heriot Watt Univ., Edinburgh (United Kingdom)

    2008-10-15

    This paper described a project in which a higher-order upwinding scheme was used to solve mass/energy conservation equations for simulating steam flood processes in an oil reservoir. Thermal recovery processes are among the most complex because they require a detailed accounting of thermal energy and chemical reaction kinetics. The numerical simulation of thermal recovery processes involves localized phenomena such as saturation and temperature fronts, due to the hyperbolic features of the governing conservation laws. A second-order accurate FV method, improved by a moving mesh strategy, was used to adjust for moving coordinates on a finely gridded domain. The finite volume method was used and the problem of steam injection was then tested using the derived solution frameworks on both fixed and moving coordinates. The benefits of using a higher-order Godunov solver instead of lower-order ones were quantified. The second-order correction resulted in better resolution of moving features. The preference for higher-order solvers over lower-order ones in terms of shock capturing is under further investigation. It was concluded that although this simulation study was limited to steam flooding processes, the newly presented approach may be suitable for other enhanced oil recovery processes such as VAPEX, SAGD and in situ combustion processes. 23 refs., 28 figs.
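
    The gain from a second-order scheme over first-order upwinding can be illustrated on the simplest hyperbolic model problem, linear advection on a periodic grid. This is a generic MUSCL/minmod sketch, not the paper's reservoir simulator, and it assumes a constant speed a > 0 with CFL below 1.

```python
import numpy as np

def minmod(a, b):
    """Slope limiter: picks the smaller slope, zero at extrema (keeps TVD)."""
    return np.where(a * b > 0.0, np.sign(a) * np.minimum(np.abs(a), np.abs(b)), 0.0)

def advect(u0, a, dx, dt, steps, order=2):
    """Solve u_t + a u_x = 0 (a > 0) on a periodic grid in flux form."""
    u = u0.astype(float).copy()
    for _ in range(steps):
        if order == 1:
            slope = np.zeros_like(u)          # first-order upwind
        else:
            slope = minmod(u - np.roll(u, 1), np.roll(u, -1) - u)
        # MUSCL face value at i+1/2, extrapolated from the upwind cell i
        uface = u + 0.5 * (1.0 - a * dt / dx) * slope
        flux = a * uface
        u = u - dt / dx * (flux - np.roll(flux, 1))
    return u
```

    Advecting a square pulse one full period shows the effect the abstract describes: both schemes conserve mass exactly (flux form), but the limited second-order scheme keeps the front markedly sharper than first-order upwinding.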

  5. To quantum averages through asymptotic expansion of classical averages on infinite-dimensional space

    International Nuclear Information System (INIS)

    Khrennikov, Andrei

    2007-01-01

    We study asymptotic expansions of Gaussian integrals of analytic functionals on infinite-dimensional spaces (Hilbert and nuclear Frechet). We obtain an asymptotic equality coupling the Gaussian integral and the trace of the composition of scaling of the covariation operator of a Gaussian measure and the second (Frechet) derivative of a functional. In this way we couple classical average (given by an infinite-dimensional Gaussian integral) and quantum average (given by the von Neumann trace formula). We can interpret this mathematical construction as a procedure of 'dequantization' of quantum mechanics. We represent quantum mechanics as an asymptotic projection of classical statistical mechanics with infinite-dimensional phase space. This space can be represented as the space of classical fields, so quantum mechanics is represented as a projection of 'prequantum classical statistical field theory'
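
    In symbols, the coupling described above can be sketched as follows. This is a hedged reconstruction from the abstract, with $B$ the covariance operator of the Gaussian measure $\mu_{\alpha}$, $f''(0)$ the second Frechet derivative of the functional, and $\alpha$ the small scaling parameter:

```latex
\int_{H} f(x)\, d\mu_{\alpha}(x)
  \;=\; \frac{\alpha}{2}\, \operatorname{Tr}\!\left[ B\, f''(0) \right] \;+\; o(\alpha),
  \qquad \alpha \to 0 .
```

    Identifying the density operator $\rho$ with a normalization of $B$ and the observable $\hat{A}$ with $f''(0)/2$, the leading term takes the von Neumann form $\operatorname{Tr}[\rho \hat{A}]$, which is the "dequantization" correspondence the abstract refers to.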

  6. A comparison of moving object detection methods for real-time moving object detection

    Science.gov (United States)

    Roshan, Aditya; Zhang, Yun

    2014-06-01

    Moving object detection has a wide variety of applications, from traffic monitoring, site monitoring, automatic theft identification and face detection to military surveillance. Many methods have been developed across the globe for moving object detection, but it is very difficult to find one which works in all situations and with different types of videos. The purpose of this paper is to evaluate existing moving object detection methods which can be implemented in software on a desktop or laptop for real-time object detection. There are several moving object detection methods noted in the literature, but few of them are suitable for real-time moving object detection. Most of the methods which handle real-time movement are further limited by the number of objects and the scene complexity. This paper evaluates the four most commonly used moving object detection methods: the background subtraction technique, the Gaussian mixture model, and wavelet based and optical flow based methods. The work is based on evaluation of these four moving object detection methods using two different sets of cameras and two different scenes. The moving object detection methods have been implemented using MATLAB and results are compared based on completeness of detected objects, noise, light change sensitivity, processing time, etc. After comparison, it is observed that the optical flow based method took the least processing time and successfully detected the boundaries of moving objects, which also implies that it can be implemented for real-time moving object detection.
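
    Of the four families compared, background subtraction is the simplest to prototype. The sketch below uses a running-average background model with a fixed threshold; the parameter values are illustrative, and this is not the paper's implementation (which used MATLAB).

```python
import numpy as np

def detect_moving(frames, alpha=0.05, thresh=25.0):
    """Running-average background subtraction; returns one boolean
    foreground mask per frame after the first."""
    bg = frames[0].astype(float)
    masks = []
    for f in frames[1:]:
        f = f.astype(float)
        masks.append(np.abs(f - bg) > thresh)   # pixels that changed
        bg = (1.0 - alpha) * bg + alpha * f     # slowly absorb the scene
    return masks
```

    The learning rate alpha trades off the light-change sensitivity the paper measures: a larger alpha adapts quickly to illumination drift but starts absorbing slow-moving objects into the background.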

  7. Analytical explicit formulas of average run length for long memory process with ARFIMA model on CUSUM control chart

    Directory of Open Access Journals (Sweden)

    Wilasinee Peerajit

    2017-12-01

    Full Text Available This paper proposes explicit formulas for the exact Average Run Length (ARL), derived via an integral equation, on a CUSUM control chart when observations are long-memory processes with exponential white noise. To verify the accuracy of the ARLs, the authors compared efficiency, in terms of the percentage of absolute difference, between the values obtained by the explicit formulas and the numerical integral equation (NIE) method. The explicit formulas were based on the Banach fixed point theorem, which was used to guarantee the existence and uniqueness of the solution for ARFIMA(p,d,q). Results showed that the two methods are in good agreement, with a percentage of absolute difference of less than 0.23%. Therefore, the explicit formulas are an efficient alternative for implementation in real applications, because the computational CPU time for ARLs from the explicit formulas is about 1 second, making them preferable to the NIE method.
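
    Absent the explicit formulas, an ARL can always be estimated by simulation. The sketch below runs a one-sided CUSUM on i.i.d. exponential observations — plain white noise, not the ARFIMA long-memory case treated in the paper — with an illustrative reference value k and threshold h.

```python
import numpy as np

def cusum_run_length(x, k, h):
    """Steps until the one-sided CUSUM C_t = max(0, C_{t-1} + x_t - k)
    first exceeds h (truncated at len(x))."""
    c = 0.0
    for t, xt in enumerate(x, start=1):
        c = max(0.0, c + xt - k)
        if c > h:
            return t
    return len(x)

def arl_monte_carlo(mean, k, h, n_rep=500, horizon=5000, seed=0):
    """Monte Carlo ARL for i.i.d. exponential observations; truncation
    at the horizon biases very long runs slightly downward."""
    rng = np.random.default_rng(seed)
    rls = [cusum_run_length(rng.exponential(mean, horizon), k, h)
           for _ in range(n_rep)]
    return float(np.mean(rls))
```

    The simulation reproduces the qualitative behaviour the chart is designed for: a long in-control ARL and a much shorter out-of-control ARL after a mean shift — though, as the paper notes, Monte Carlo is far slower than either the explicit formulas or the NIE method.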

  8. An integrated mathematical model for chemical oxygen demand (COD) removal in moving bed biofilm reactors (MBBR) including predation and hydrolysis.

    Science.gov (United States)

    Revilla, Marta; Galán, Berta; Viguri, Javier R

    2016-07-01

    An integrated mathematical model is proposed for modelling a moving bed biofilm reactor (MBBR) for removal of chemical oxygen demand (COD) under aerobic conditions. The composite model combines the following: (i) a one-dimensional biofilm model, (ii) a bulk liquid model, and (iii) biological processes in the bulk liquid and biofilm considering the interactions among autotrophic, heterotrophic and predator microorganisms. Depending on the values of the soluble biodegradable COD loading rate (SCLR), the model takes into account (a) the hydrolysis of slowly biodegradable compounds in the bulk liquid, and (b) the growth of predator microorganisms in the bulk liquid and in the biofilm. The integration of the model and the SCLR allows a general description of the behaviour of COD removal by the MBBR under various conditions. The model is applied to two in-series MBBRs of a wastewater plant from an integrated cellulose and viscose production site and accurately describes the experimental concentrations of COD, total suspended solids (TSS), nitrogen and phosphorus obtained during 14 months of operation at different SCLRs and nutrient dosages. The representation of the microorganism group distribution in the biofilm and in the bulk liquid allows for verification of the presence of predator microorganisms in the second reactor under some operational conditions. Copyright © 2016 Elsevier Ltd. All rights reserved.

  9. Method and apparatus for a combination moving bed thermal treatment reactor and moving bed filter

    Energy Technology Data Exchange (ETDEWEB)

    Badger, Phillip C.; Dunn, Jr., Kenneth J.

    2015-09-01

    A moving bed gasification/thermal treatment reactor includes a geometry in which moving bed reactor particles serve as both a moving bed filter and a heat carrier to provide thermal energy for thermal treatment reactions, such that the moving bed filter and the heat carrier are one and the same to remove solid particulates or droplets generated by thermal treatment processes or injected into the moving bed filter from other sources.

  10. Integrated Monitoring System of Production Processes

    Directory of Open Access Journals (Sweden)

    Oborski Przemysław

    2016-12-01

    Full Text Available An integrated monitoring system for discrete manufacturing processes is presented in the paper. A multilayer hardware and software reference model was developed. The original research answers industry needs for integrating information flow in the production process. The reference model corresponds to the proposed data model, based on a multilayer data tree that describes orders, products and processes and stores monitoring data. The elaborated models were implemented in the integrated monitoring system demonstrator developed in the project. It was built on multiagent technology to assure high flexibility and openness in applying intelligent algorithms for data processing. Currently, on the basis of the experience gained, an integrated monitoring system for a real production system is being developed. The article presents the main problems of monitoring integration, including the specificity of discrete production, data processing and the future application of Cyber-Physical Systems. Development of manufacturing systems is increasingly based on taking advantage of intelligent solutions in machine and production process control and monitoring. The connection of technical systems, machine tools and manufacturing process monitoring with advanced information processing seems to be one of the most important areas of near-future development. It will play an important role in the efficient operation and competitiveness of the whole production system. It is also an important area for the future application of Cyber-Physical Systems, which can radically improve the functionality of monitoring systems and reduce the cost of their implementation.

  11. Biomass Torrefaction Process Review and Moving Bed Torrefaction System Model Development

    Energy Technology Data Exchange (ETDEWEB)

    Jaya Shakar Tumuluru; Shahab Sokhansanj; Christopher T. Wright; Richard D. Boardman

    2010-08-01

    Torrefaction is currently developing as an important preprocessing step to improve the quality of biomass in terms of physical properties and proximate and ultimate composition. Torrefaction is a slow heating of biomass in an inert or reduced environment to a maximum temperature of 300°C. Torrefaction can also be defined as a group of products resulting from the partially controlled and isothermal pyrolysis of biomass occurring in a temperature range of 200-230°C and 270-280°C. Thus, the process can also be called a mild pyrolysis, as it occurs in the lower temperature range of the pyrolysis process. At the end of the torrefaction process, a solid uniform product with lower moisture content and higher energy content than raw biomass is produced. Most of the smoke-producing compounds and other volatiles are removed during torrefaction, producing a final product that will have a lower mass but a higher heating value. An important aspect of research is to establish a degree of torrefaction where gains in heating value offset the loss of mass. There is a lack of literature on torrefaction reactor designs and a design sheet for estimating the dimensions of the torrefier based on capacity. This study includes (a) conducting a detailed review of the torrefaction of biomass in terms of understanding the process, product properties, off-gas compositions, and methods used, and (b) designing a moving bed torrefier, taking into account the basic fundamental heat and mass transfer calculations. Specific objectives include calculating dimensions such as the diameter and height of the moving packed bed for different capacities, designing the heat loads and gas flow rates, and developing an interactive Excel sheet where the user can define design specifications. In this report, capacities of 25-1000 kg/hr are used in the equations for the design of the torrefier, examples of calculations, and specifications for the torrefier.
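
    In its simplest form, the capacity-to-dimensions calculation described above reduces to a holdup and bulk-density balance. The numbers below (residence time, bulk density, aspect ratio) are illustrative placeholders, not values from the report's design sheet.

```python
import math

def moving_bed_dimensions(capacity_kg_hr, residence_min=30.0,
                          bulk_density=250.0, aspect_ratio=2.5):
    """Size a cylindrical moving packed bed from throughput: the bed must
    hold capacity x residence time kilograms of biomass."""
    holdup = capacity_kg_hr * residence_min / 60.0      # kg held in the bed
    volume = holdup / bulk_density                      # m^3 of bed
    diameter = (4.0 * volume / (math.pi * aspect_ratio)) ** (1.0 / 3.0)
    height = aspect_ratio * diameter
    return diameter, height, volume
```

    Because bed volume scales linearly with capacity while diameter scales with its cube root, doubling throughput requires only about a 26% larger diameter at a fixed aspect ratio — the kind of relation an interactive design sheet makes easy to explore.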

  12. Business process technology and the cloud : defining a business process cloud platform

    NARCIS (Netherlands)

    Stoitsev, V.; Grefen, P.W.P.J.

    2012-01-01

    The present state of the integration between business process technology and the Cloud is vague and not well defined. Industry research organizations predict that enterprises will be moving in both these directions in the next few years. This will increase the need for a clear integration between

  13. Asymptotically optimum multialternative sequential procedures for discernment of processes minimizing average length of observations

    Science.gov (United States)

    Fishman, M. M.

    1985-01-01

    The problem of multialternative sequential discernment of processes is formulated in terms of conditionally optimum procedures minimizing the average length of observations, without any probabilistic assumptions about any one occurring process, rather than in terms of Bayes procedures minimizing the average risk. The problem is to find the procedure that will transform inequalities into equalities. The problem is formulated for various models of signal observation and data processing: (1) discernment of signals from background interference by a multichannel system; (2) discernment of pulse sequences with unknown time delay; (3) discernment of harmonic signals with unknown frequency. An asymptotically optimum sequential procedure is constructed which compares the statistics of the likelihood ratio with the mean-weighted likelihood ratio and estimates the upper bound for conditional average lengths of observations. This procedure is shown to remain valid as the upper bound for the probability of erroneous partial solutions decreases approaching zero and the number of hypotheses increases approaching infinity. It also remains valid under certain special constraints on the probability such as a threshold. A comparison with a fixed-length procedure reveals that this sequential procedure decreases the length of observations to one quarter, on the average, when the probability of erroneous partial solutions is low.
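
    A minimal multialternative sequential procedure can be sketched as follows: accumulate log-likelihoods and stop as soon as the leading hypothesis outscores every competitor by a fixed margin. This is a generic Gaussian-mean illustration of sequential discernment, not Fishman's conditionally optimum procedure; the margin and means are arbitrary.

```python
import numpy as np

def msprt(samples, means, sigma=1.0, margin=10.0, max_n=10000):
    """Stop when one hypothesis' log-likelihood leads every competitor
    by `margin`; return (accepted hypothesis index, samples used)."""
    means = np.asarray(means, dtype=float)
    loglik = np.zeros(len(means))
    n = 0
    for x in samples:
        n += 1
        loglik += -0.5 * ((x - means) / sigma) ** 2
        order = np.argsort(loglik)
        if loglik[order[-1]] - loglik[order[-2]] > margin:
            return int(order[-1]), n
        if n >= max_n:
            break
    return int(np.argmax(loglik)), n
```

    As the abstract observes for the optimal procedure, the sequential stopping rule typically terminates far sooner than a fixed-length test of comparable error probability, because well-separated hypotheses accumulate evidence quickly.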

  14. Designing components using smartMOVE electroactive polymer technology

    Science.gov (United States)

    Rosenthal, Marcus; Weaber, Chris; Polyakov, Ilya; Zarrabi, Al; Gise, Peter

    2008-03-01

    Designing components using SmartMOVE TM electroactive polymer technology requires an understanding of the basic operation principles and the necessary design tools for integration into actuator, sensor and energy generation applications. Artificial Muscle, Inc. is collaborating with OEMs to develop customized solutions for their applications using smartMOVE. SmartMOVE is an advanced and elegant way to obtain almost any kind of movement using dielectric elastomer electroactive polymers. Integration of this technology offers the unique capability to create highly precise and customized motion for devices and systems that require actuation. Applications of SmartMOVE include linear actuators for medical, consumer and industrial applications, such as pumps, valves, optical or haptic devices. This paper will present design guidelines for selecting a smartMOVE actuator design to match the stroke, force, power, size, speed, environmental and reliability requirements for a range of applications. Power supply and controller design and selection will also be introduced. An overview of some of the most versatile configuration options will be presented with performance comparisons. A case example will include the selection, optimization, and performance overview of a smartMOVE actuator for the cell phone camera auto-focus and proportional valve applications.

  15. Ultrasound image based visual servoing for moving target ablation by high intensity focused ultrasound.

    Science.gov (United States)

    Seo, Joonho; Koizumi, Norihiro; Mitsuishi, Mamoru; Sugita, Naohiko

    2017-12-01

    Although high intensity focused ultrasound (HIFU) is a promising technology for tumor treatment, a moving abdominal target is still a challenge in current HIFU systems. In particular, respiratory-induced organ motion can reduce the treatment efficiency and negatively influence the treatment result. In this research, we present: (1) a methodology for integration of ultrasound (US) image based visual servoing in a HIFU system; and (2) the experimental results obtained using the developed system. In the visual servoing system, target motion is monitored by biplane US imaging and tracked in real time (40 Hz) by registration with a preoperative 3D model. The distance between the target and the current HIFU focal position is calculated in every US frame and a three-axis robot physically compensates for differences. Because simultaneous HIFU irradiation disturbs US target imaging, a sophisticated interlacing strategy was constructed. In the experiments, respiratory-induced organ motion was simulated in a water tank with a linear actuator and kidney-shaped phantom model. Motion compensation with HIFU irradiation was applied to the moving phantom model. Based on the experimental results, visual servoing exhibited a motion compensation accuracy of 1.7 mm (RMS) on average. Moreover, the integrated system could make a spherical HIFU-ablated lesion in the desired position of the respiratory-moving phantom model. We have demonstrated the feasibility of our US image based visual servoing technique in a HIFU system for moving target treatment. © 2016 The Authors The International Journal of Medical Robotics and Computer Assisted Surgery Published by John Wiley & Sons Ltd.

  16. Logistics integration processes in the food industry

    OpenAIRE

    Giménez, Cristina

    2003-01-01

    This paper analyses the integration process that firms follow to implement Supply Chain Management (SCM). This study has been inspired in the integration model proposed by Stevens (1989). He suggests that companies internally integrate first and then extend integration to other supply chain members, such as customers and suppliers. To analyse the integration process a survey was conducted among Spanish food manufacturers. The results show that there are companies in three different integratio...

  17. An integrated multi-stage supply chain inventory model with imperfect production process

    Directory of Open Access Journals (Sweden)

    Soumita Kundu

    2015-09-01

    Full Text Available This paper deals with an integrated multi-stage supply chain inventory model with the objective of cost minimization by synchronizing the replenishment decisions for procurement, production and delivery activities. The supply chain structure examined here consists of a single manufacturer with multiple buyers, where the manufacturer orders a fixed quantity of raw material from outside suppliers, processes the material and delivers the finished products in unequal shipments to each customer. In this paper, we consider an imperfect production system which produces defective items randomly, and assume that all defective items can be reworked. A simple algorithm is developed to obtain an optimal production policy which minimizes the expected average total cost of the integrated production-inventory system.
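
    The flavour of minimizing an expected average total cost over lot sizes can be shown with a stylized single-stage stand-in: setup plus holding plus rework of a fixed expected defect fraction. The cost structure and every parameter value are illustrative, not the paper's multi-stage, multi-buyer model.

```python
import numpy as np

def expected_avg_cost(q, demand=1000.0, setup=100.0, hold=2.0,
                      defect_rate=0.1, rework_cost=5.0):
    """Stylized expected average cost per year for lot size q."""
    return (setup * demand / q          # setup cost x cycles per year
            + hold * q / 2.0            # average-inventory holding cost
            + rework_cost * defect_rate * demand)   # expected rework

def best_lot_size(qs, **kw):
    costs = [expected_avg_cost(q, **kw) for q in qs]
    return qs[int(np.argmin(costs))]
```

    With rework cost independent of q, the optimum collapses to the EOQ trade-off between setups and holding; in the paper's integrated model the shipment schedule and random defect fraction couple the terms, which is why a search algorithm is needed.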

  18. Cooperative Scalable Moving Continuous Query Processing

    DEFF Research Database (Denmark)

    Li, Xiaohui; Karras, Panagiotis; Jensen, Christian S.

    2012-01-01

    of the global view and handle the majority of the workload. Meanwhile, moving clients, having basic memory and computation resources, handle small portions of the workload. This model is further enhanced by dynamic region allocation and grid size adjustment mechanisms that reduce the communication and computation cost for both servers and clients. An experimental study demonstrates that our approaches offer better scalability than competitors.

  19. Integrated modelling in materials and process technology

    DEFF Research Database (Denmark)

    Hattel, Jesper Henri

    2008-01-01

    Integrated modelling of entire process sequences and the subsequent in-service conditions, and multiphysics modelling of the single process steps are areas that increasingly support optimisation of manufactured parts. In the present paper, three different examples of modelling manufacturing processes from the viewpoint of combined materials and process modelling are presented: solidification of thin walled ductile cast iron, integrated modelling of spray forming and multiphysics modelling of friction stir welding. The fourth example describes integrated modelling applied to a failure analysis.

  20. Integrated systems of monitoring and environmental data processing for nuclear facilities

    International Nuclear Information System (INIS)

    Diaconu, C.; Guta, V.; Oprea, I.; Oprea, M.; Stoica, M.; Pirvu, V.; Vasilache, E.; Pirvu, I.

    2001-01-01

    The processing of the huge amount of data necessary to assess the real radiological situation, both in normal operational conditions and during accidents, requires an efficient system of monitoring and data processing. It must be able to secure information for the complex systems of radioactivity control aimed at evaluating nuclear accident consequences and establishing a basis for correct decision making in the field of civil protection. The integrated environmental monitoring systems are based on a number of fixed and mobile installations, a meteorological parameter measurement station, a center for data processing and a communication network, all working under the control of a real-time operating system. They collect and process the radioactivity level and meteorological data and transmit them through the communication network. The local monitoring stations are made of detector ensembles with pressurized ionization chambers and autonomous units continuously providing information on dose and integrated rates, average values, as well as the current state of the station. The meteorological data acquisition station supplies information concerning wind direction and speed, temperature and precipitation level. The information processing center is based on a PC integrated in a local network which collects data from the radiation monitoring equipment, the meteorological station and other workstations which process various dosimetric parameters. It is connected to the Internet, thus ensuring fast transfer of information to the interested authorities. The communication network consists of a local or extended Ethernet network, and radio or serial connections for radioactivity level monitoring units, which can be stationary, portable or mobile. Requirements raised by the application of a geographic information system (GIS) and the real-time operating system (QNX), ensuring multiuser and multitask operation, are discussed.

  1. Business process technology and the cloud : defining a business process cloud platform

    OpenAIRE

    Stoitsev, V.; Grefen, P.W.P.J.

    2012-01-01

    The present state of the integration between business process technology and the Cloud is vague and not well defined. Industry research organizations predict that enterprises will be moving in both these directions in the next few years. This will increase the need for a clear integration between these two areas. Apart from this, many current problems with automated business processes stem from the poor connection between business application systems and the needed business process support, a...

  2. Timescale Halo: Average-Speed Targets Elicit More Positive and Less Negative Attributions than Slow or Fast Targets

    Science.gov (United States)

    Hernandez, Ivan; Preston, Jesse Lee; Hepler, Justin

    2014-01-01

    Research on the timescale bias has found that observers perceive more capacity for mind in targets moving at an average speed, relative to slow or fast moving targets. The present research revisited the timescale bias as a type of halo effect, where normal-speed people elicit positive evaluations and abnormal-speed (slow and fast) people elicit negative evaluations. In two studies, participants viewed videos of people walking at a slow, average, or fast speed. We find evidence for a timescale halo effect: people walking at an average-speed were attributed more positive mental traits, but fewer negative mental traits, relative to slow or fast moving people. These effects held across both cognitive and emotional dimensions of mind and were mediated by overall positive/negative ratings of the person. These results suggest that, rather than eliciting greater perceptions of general mind, the timescale bias may reflect a generalized positivity toward average speed people relative to slow or fast moving people. PMID:24421882

  3. Ergodic averages for monotone functions using upper and lower dominating processes

    DEFF Research Database (Denmark)

    Møller, Jesper; Mengersen, Kerrie

    We show how the mean of a monotone function (defined on a state space equipped with a partial ordering) can be estimated, using ergodic averages calculated from upper and lower dominating processes of a stationary irreducible Markov chain. In particular, we do not need to simulate the stationary Markov chain and we eliminate the problem of whether an appropriate burn-in is determined or not. Moreover, when a central limit theorem applies, we show how confidence intervals for the mean can be estimated by bounding the asymptotic variance of the ergodic average based on the equilibrium chain. Our methods are studied in detail for three models using Markov chain Monte Carlo methods and we also discuss various types of other models for which our methods apply.
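
    The idea can be illustrated on a toy monotone chain: a reflected random walk on {0, ..., m} updated from common uniforms, so that the ordering of the two chains is preserved and the ergodic averages of a monotone f from the lower and upper chains sandwich each other. This is a sketch of the sandwiching idea only, not the authors' construction; the chain and all parameters are invented for illustration.

```python
import numpy as np

def monotone_step(x, u, m=10, p=0.5):
    """Monotone update of a reflected walk on {0,...,m}: feeding the same
    uniform u to both chains preserves their ordering."""
    return min(x + 1, m) if u < p else max(x - 1, 0)

def sandwich_averages(f, n=20000, m=10, p=0.5, seed=0):
    """Ergodic averages of f along the lower (start 0) and upper (start m)
    chains; for monotone f the lower average never exceeds the upper."""
    rng = np.random.default_rng(seed)
    lo, hi = 0, m
    s_lo = s_hi = 0.0
    for _ in range(n):
        u = rng.random()
        lo, hi = monotone_step(lo, u, m, p), monotone_step(hi, u, m, p)
        s_lo += f(lo)
        s_hi += f(hi)
    return s_lo / n, s_hi / n
```

    Once the two chains coalesce at a boundary reflection, their averages differ only through the initial transient, so the gap between them also serves as a built-in convergence diagnostic — no separate burn-in decision is needed.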

  4. Ergodic averages for monotone functions using upper and lower dominating processes

    DEFF Research Database (Denmark)

    Møller, Jesper; Mengersen, Kerrie

    2007-01-01

    We show how the mean of a monotone function (defined on a state space equipped with a partial ordering) can be estimated, using ergodic averages calculated from upper and lower dominating processes of a stationary irreducible Markov chain. In particular, we do not need to simulate the stationary Markov chain and we eliminate the problem of whether an appropriate burn-in is determined or not. Moreover, when a central limit theorem applies, we show how confidence intervals for the mean can be estimated by bounding the asymptotic variance of the ergodic average based on the equilibrium chain. Our methods are studied in detail for three models using Markov chain Monte Carlo methods and we also discuss various types of other models for which our methods apply.

  5. One-dimensional quantum walk with a moving boundary

    International Nuclear Information System (INIS)

    Kwek, Leong Chuan; Setiawan

    2011-01-01

    Quantum walks are interesting models with potential applications to quantum algorithms and physical processes such as photosynthesis. In this paper, we study two models of one-dimensional quantum walks, namely, quantum walks with a moving absorbing wall and quantum walks with one stationary and one moving absorbing wall. For the former, we calculate numerically the survival probability, the rate of change of average position, and the rate of change of standard deviation of the particle's position in the long time limit for different wall velocities. Moreover, we also study the asymptotic behavior and the dependence of the survival probability on the initial particle's state. For the latter, we compute the absorption probability of the right stationary wall for different velocities and initial positions of the left wall boundary. The results for these two models are compared with those obtained for the classical model. The difference between the results obtained for the quantum and classical models can be attributed to the difference in the probability distributions.
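
    A minimal simulation of the first model — a Hadamard walk with one absorbing wall — shows how survival probability depends on wall velocity. The symmetric initial coin state and the wall parametrization below are arbitrary illustrative choices, not those of the paper.

```python
import numpy as np

def hadamard_walk_survival(steps=100, wall_v=0.0):
    """Survival probability of a 1D Hadamard walk started at the origin,
    with an absorbing wall just left of the origin receding leftward at
    wall_v sites per step."""
    size = 2 * steps + 5
    origin = steps + 2
    psi = np.zeros((size, 2), dtype=complex)
    psi[origin] = [1.0 / np.sqrt(2.0), 1j / np.sqrt(2.0)]   # symmetric start
    H = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2.0)
    survival = 1.0
    for t in range(1, steps + 1):
        psi = psi @ H                        # coin toss (H is symmetric)
        new = np.zeros_like(psi)
        new[:-1, 0] = psi[1:, 0]             # coin 0 shifts left
        new[1:, 1] = psi[:-1, 1]             # coin 1 shifts right
        psi = new
        wall = origin - 1 - int(wall_v * t)  # current wall site
        survival -= np.sum(np.abs(psi[: wall + 1]) ** 2)
        psi[: wall + 1] = 0.0                # absorb at and left of the wall
    return survival
```

    A wall receding at one site per step can never catch the walker, whose support also spreads at one site per step, so survival stays at 1; a stationary wall absorbs part of the amplitude at every step.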

  6. Automatic Moving Object Segmentation for Freely Moving Cameras

    Directory of Open Access Journals (Sweden)

    Yanli Wan

    2014-01-01

    Full Text Available This paper proposes a new moving object segmentation algorithm for freely moving cameras, which are very common in outdoor surveillance systems, car built-in surveillance systems, and robot navigation systems. A two-layer affine transformation model optimization method is proposed for camera compensation, where the outer-layer iteration is used to filter out non-background feature points, and the inner-layer iteration is used to estimate a refined affine model based on the RANSAC method. The feature points are then classified into foreground and background according to the detected motion information. A geodesic based graph cut algorithm is then employed to extract the moving foreground based on the classified features. Unlike existing methods based on global optimization or long-term feature point tracking, our algorithm operates on only two successive frames to segment the moving foreground, which makes it suitable for online video processing applications. The experimental results demonstrate the effectiveness of our algorithm in both accuracy and speed.
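
    The inner-layer step — estimating a refined affine camera model with RANSAC and separating non-background feature points — can be sketched with NumPy as follows; the thresholds and iteration counts are illustrative.

```python
import numpy as np

def fit_affine(src, dst):
    """Least-squares 2D affine map src -> dst; returns a 3x2 matrix M
    such that [x, y, 1] @ M approximates the destination point."""
    A = np.hstack([src, np.ones((len(src), 1))])
    M, *_ = np.linalg.lstsq(A, dst, rcond=None)
    return M

def ransac_affine(src, dst, iters=200, tol=1.0, seed=0):
    """RANSAC: fit on 3 random correspondences, keep the model with the
    most inliers, then refit on all inliers (the refined estimate)."""
    rng = np.random.default_rng(seed)
    ones = np.ones((len(src), 1))
    best = np.zeros(len(src), dtype=bool)
    for _ in range(iters):
        idx = rng.choice(len(src), 3, replace=False)
        M = fit_affine(src[idx], dst[idx])
        err = np.linalg.norm(np.hstack([src, ones]) @ M - dst, axis=1)
        inliers = err < tol
        if inliers.sum() > best.sum():
            best = inliers
    return fit_affine(src[best], dst[best]), best
```

    Points rejected as outliers of the camera model are exactly the independently moving (foreground) features, which is what makes the subsequent foreground/background classification possible from two frames alone.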

  7. Modelling and synthesis of pharmaceutical processes: moving from batch to continuous

    DEFF Research Database (Denmark)

    Papadakis, Emmanouil

    and to investigate/evaluate opportunities for continuous operation. To achieve the mentioned objectives, the use of an integrated framework based on systematic model-based methods and tools is proposed. Computer-aided methods and tools are used to generate process knowledge and to evaluate different operational alternatives. Optimization studies are performed by defining optimization targets based on the process analysis. The application of the developed integrated framework is highlighted through four case studies. In the first case study, the overall use of the framework is highlighted using the synthesis of ibuprofen.

  8. Structural integration of separation and reaction systems: I. Integration of stage-wise processes

    Directory of Open Access Journals (Sweden)

    Mitrović Milan

    2002-01-01

    Full Text Available The structural integration of separation processes, using multifunctional equipment, has been studied on four stage-wise separations (liquid-liquid extraction, absorption, distillation, adsorption) and on some combinations of these processes. It was shown for stage-wise processes that the ultimate aim of equipment integration is 3-way integration (by components, by steps and by stages), and that membrane multiphase contactors represent, as equipment, optimal solutions in many cases. First by using partially integrated equipment, and later by developing fully integrated systems, it was experimentally confirmed that structural 3-way integration produces much higher degrees of component separation and component enrichment in compact and safe equipment.

  9. Effects of temperature and velocity of droplet ejection process of simulated nanojets onto a moving plate's surface

    International Nuclear Information System (INIS)

    Fang, T.-H.; Chang, W.-J.; Lin, S.-L.

    2006-01-01

    This paper uses molecular dynamics simulation based on the Lennard-Jones potential to study the effects of temperature and velocity on the nanojet droplet ejection process when the droplet is ejected at an angle onto a moving plate's surface. According to the analysis, it was found that the width of the spreading droplet increased as the temperature and the time were increased. An energy wave phenomenon was also found. The contact angle of the droplet deposited on the plate decreased as the temperature was increased. Furthermore, the layer phenomena became apparent when the atoms were deposited on a moving plate. Thinner film layers were obtained as the velocity of the moving plate was increased. The contact angle on the left side of the droplet was larger than that on the right side when the plate was moving from right to left.

  10. Biomass Torrefaction Process Review and Moving Bed Torrefaction System Model Development

    Energy Technology Data Exchange (ETDEWEB)

    Jaya Shakar Tumuluru; Shahab Sokhansanj; Christopher T. Wright

    2010-08-01

    Torrefaction is currently developing as an important preprocessing step to improve the quality of biomass in terms of physical properties and proximate and ultimate composition. Torrefaction is a slow heating of biomass in an inert or reduced environment to a maximum temperature of 300°C. Torrefaction can also be defined as a group of products resulting from the partially controlled and isothermal pyrolysis of biomass occurring in the temperature ranges of 200–230°C and 270–280°C. Thus, the process can also be called a mild pyrolysis, as it occurs at the lower temperature range of the pyrolysis process. At the end of the torrefaction process, a solid uniform product with lower moisture content and higher energy content than raw biomass is produced. Most of the smoke-producing compounds and other volatiles are removed during torrefaction, producing a final product with lower mass but a higher heating value. An important aspect of research is to establish a degree of torrefaction where gains in heating value offset the loss of mass. There is a lack of literature on torrefaction reactor designs and on design sheets for estimating the dimensions of the torrefier based on capacity. This study includes a) conducting a detailed review of the torrefaction of biomass in terms of understanding the process, product properties, off-gas compositions, and methods used, and b) designing a moving bed torrefier, taking into account basic heat and mass transfer calculations. Specific objectives include calculating dimensions such as the diameter and height of the moving packed bed for different capacities, designing the heat loads and gas flow rates, and developing an interactive Excel sheet where the user can define design specifications. In this report, capacities of 25–1000 kg/hr are used in the torrefier design equations, example calculations, and torrefier specifications.
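The sizing logic described above (bed volume from capacity and residence time, then diameter and height) can be sketched as follows. All default parameter values here are illustrative assumptions, not the report's design numbers:

```python
import math

def torrefier_dimensions(capacity_kg_hr, bulk_density=250.0,
                         residence_time_min=30.0, aspect_ratio=2.5):
    """Rough moving-bed torrefier sizing.

    bulk_density [kg/m^3], residence_time_min [min], and the
    height-to-diameter aspect_ratio are assumed placeholder values.
    Returns (diameter, height) in meters.
    """
    # Material held up in the bed at steady state.
    hold_up_kg = capacity_kg_hr * residence_time_min / 60.0
    volume_m3 = hold_up_kg / bulk_density
    # Cylindrical bed: V = (pi/4) * d^2 * h, with h = aspect_ratio * d.
    diameter = (4.0 * volume_m3 / (math.pi * aspect_ratio)) ** (1.0 / 3.0)
    height = aspect_ratio * diameter
    return diameter, height

d, h = torrefier_dimensions(1000.0)
print(round(d, 3), round(h, 3))
```

The same calculation, swept over 25–1000 kg/hr, is the kind of relation an interactive design sheet would tabulate.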

  11. On the Coplanar Integrable Case of the Twice-Averaged Hill Problem with Central Body Oblateness

    Science.gov (United States)

    Vashkov'yak, M. A.

    2018-01-01

    The twice-averaged Hill problem with the oblateness of the central planet is considered in the case where its equatorial plane coincides with the plane of its orbital motion relative to the perturbing body. A qualitative study of this so-called coplanar integrable case was begun by Y. Kozai in 1963 and continued by M.L. Lidov and M.V. Yarskaya in 1974. However, no rigorous analytical solution of the problem can be obtained due to the complexity of the integrals. In this paper we obtain some quantitative evolution characteristics and propose an approximate constructive-analytical solution of the evolution system in the form of explicit time dependences of satellite orbit elements. The methodical accuracy has been estimated for several orbits of artificial lunar satellites by comparison with the numerical solution of the evolution system.

  12. Extracting gravitational waves induced by plasma turbulence in the early Universe through an averaging process

    International Nuclear Information System (INIS)

    Garrison, David; Ramirez, Christopher

    2017-01-01

    This work is a follow-up to the paper, ‘Numerical relativity as a tool for studying the early Universe’. In this article, we determine if cosmological gravitational waves can be accurately extracted from a dynamical spacetime using an averaging process as opposed to conventional methods of gravitational wave extraction using a complex Weyl scalar. We calculate the normalized energy density, strain and degree of polarization of gravitational waves produced by a simulated turbulent plasma similar to what was believed to have existed shortly after the electroweak scale. This calculation is completed using two numerical codes, one of which utilizes full general relativity calculations based on modified BSSN equations, while the other utilizes a linearized approximation of general relativity. Our results show that the spectrum of gravitational waves calculated from the nonlinear code using an averaging process is nearly indistinguishable from that calculated from the linear code. This result validates the use of the averaging process for gravitational wave extraction of cosmological systems. (paper)

  13. Percentiles of the run-length distribution of the Exponentially Weighted Moving Average (EWMA) median chart

    Science.gov (United States)

    Tan, K. L.; Chong, Z. L.; Khoo, M. B. C.; Teoh, W. L.; Teh, S. Y.

    2017-09-01

    Quality control is crucial in a wide variety of fields, as it can help to satisfy customers’ needs and requirements by enhancing and improving products and services to a superior quality level. The EWMA median chart was proposed as a useful alternative to the EWMA \bar{X} chart because the median-type chart is robust against contamination, outliers, or small deviations from the normality assumption, compared to the traditional \bar{X}-type chart. To provide a complete understanding of the run-length distribution, the percentiles of the run-length distribution should be investigated rather than depending solely on the average run length (ARL) performance measure. This is because interpretation depending on the ARL alone can be misleading, as the skewness and shape of the run-length distribution change with the magnitude of the process mean shift, varying from almost symmetric when the magnitude of the mean shift is large, to highly right-skewed when the process is in-control (IC) or only slightly out-of-control (OOC). Before computing the percentiles of the run-length distribution, optimal parameters of the EWMA median chart are obtained by minimizing the OOC ARL, while retaining the IC ARL at a desired value.
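A run-length distribution of the kind discussed above can be estimated by Monte Carlo simulation. The sketch below applies an EWMA to sample medians of normal data; the smoothing constant `lam`, limit multiplier `L`, and subgroup size `n` are assumed illustrative choices, not the optimized chart parameters of the paper:

```python
import numpy as np

def ewma_median_run_lengths(lam=0.1, L=2.7, n=5, shift=0.0,
                            n_runs=2000, max_len=10000, seed=1):
    """Monte Carlo run lengths of an EWMA chart on sample medians.

    shift is the process mean shift in units of the observation SD.
    Returns an array of n_runs simulated run lengths.
    """
    rng = np.random.default_rng(seed)
    # Empirical standard deviation of the median of n N(0,1) observations.
    med_sd = np.median(rng.standard_normal((20000, n)), axis=1).std()
    # Asymptotic EWMA control limit: L * sigma_median * sqrt(lam / (2 - lam)).
    limit = L * med_sd * np.sqrt(lam / (2.0 - lam))
    run_lengths = []
    for _ in range(n_runs):
        z, t = 0.0, 0
        while t < max_len:
            t += 1
            m = np.median(rng.standard_normal(n) + shift)
            z = lam * m + (1.0 - lam) * z
            if abs(z) > limit:  # out-of-control signal
                break
        run_lengths.append(t)
    return np.array(run_lengths)

rl = ewma_median_run_lengths(shift=1.0, n_runs=500)
print(np.percentile(rl, [5, 50, 95]))
```

The printed 5th, 50th, and 95th percentiles summarize the run-length distribution far more informatively than the ARL alone, which is the point the abstract makes.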

  14. Efficient processing of CFRP with a picosecond laser with up to 1.4 kW average power

    Science.gov (United States)

    Onuseit, V.; Freitag, C.; Wiedenmann, M.; Weber, R.; Negel, J.-P.; Löscher, A.; Abdou Ahmed, M.; Graf, T.

    2015-03-01

    Laser processing of carbon fiber reinforced plastic (CFRP) is a very promising method to solve many of the challenges of large-volume production of lightweight constructions in the automotive and airplane industries. However, the laser process is currently limited by two main issues. First, the quality may be reduced by thermal damage, and second, the high process energy needed for sublimation of the carbon fibers requires laser sources with high average power for productive processing. To keep the thermal damage of the CFRP below 10 μm, intensities above 10⁸ W/cm² are needed. To reach these high intensities in the processing area, ultra-short pulse laser systems are favored. Unfortunately, the average power of commercially available laser systems is up to now in the range of several tens to a few hundred watts. To sublimate the carbon fibers, a large volume-specific enthalpy of 85 J/mm³ is necessary. This means, for example, that cutting 2 mm thick material with a kerf width of 0.2 mm at an industry-typical 100 mm/sec requires several kilowatts of average power. At the IFSW, a thin-disk multipass amplifier yielding a maximum average output power of 1100 W (300 kHz, 8 ps, 3.7 mJ) allowed for the first time the processing of CFRP at this average power and pulse energy level with picosecond pulse duration. With this unique laser system, cutting of CFRP with a thickness of 2 mm at an effective average cutting speed of 150 mm/sec with thermal damage below 10 μm was demonstrated.
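The quoted energy balance can be checked directly with the numbers from the abstract (absorbed power only; process losses are neglected):

```python
# Volume-specific sublimation enthalpy and cutting geometry from the text.
enthalpy = 85.0    # J/mm^3
thickness = 2.0    # mm
kerf = 0.2         # mm
speed = 100.0      # mm/s

removal_rate = thickness * kerf * speed   # removed volume per second, mm^3/s
power = enthalpy * removal_rate           # required average power, W
print(power)
```

With these numbers the removal rate is 40 mm³/s and the required power about 3.4 kW, consistent with the abstract's "several kilowatts of average power".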

  15. Human Integration Design Processes (HIDP)

    Science.gov (United States)

    Boyer, Jennifer

    2014-01-01

    The purpose of the Human Integration Design Processes (HIDP) document is to provide human-systems integration design processes, including methodologies and best practices that NASA has used to meet human systems and human rating requirements for developing crewed spacecraft. HIDP content is framed around human-centered design methodologies and processes in support of human-system integration requirements and human rating. NASA-STD-3001, Space Flight Human-System Standard, is a two-volume set of National Aeronautics and Space Administration (NASA) Agency-level standards established by the Office of the Chief Health and Medical Officer, directed at minimizing health and performance risks for flight crews in human space flight programs. Volume 1 of NASA-STD-3001, Crew Health, sets standards for fitness for duty, space flight permissible exposure limits, permissible outcome limits, levels of medical care, medical diagnosis, intervention, treatment and care, and countermeasures. Volume 2 of NASA-STD-3001, Human Factors, Habitability, and Environmental Health, focuses on human physical and cognitive capabilities and limitations and defines standards for spacecraft (including orbiters, habitats, and suits), internal environments, facilities, payloads, and related equipment, hardware, and software with which the crew interfaces during space operations. The NASA Procedural Requirements (NPR) 8705.2B, Human-Rating Requirements for Space Systems, specifies the Agency's human-rating processes, procedures, and requirements. The HIDP was written to share NASA's knowledge of processes directed toward achieving human certification of a spacecraft through implementation of human-systems integration requirements.
Although the HIDP speaks directly to implementation of NASA-STD-3001 and NPR 8705.2B requirements, the human-centered design, evaluation, and design processes described in this document can be applied to any set of human-systems requirements and are independent of reference

  16. Nano integrated circuit process

    International Nuclear Information System (INIS)

    Yoon, Yung Sup

    2004-02-01

    This book contains nine chapters: an introduction to semiconductor chip manufacture; oxidation (dry oxidation, wet oxidation, oxidation models, and oxide films); diffusion (the diffusion process, the diffusion equation, diffusion coefficients, and diffusion systems); ion implantation (ion distribution, channeling, multi-implantation, masking, and implantation systems); sputtering (CVD and PVD); lithography; wet and dry etching; interconnection and planarization (metal-silicon contacts, silicides, multilayer metallization, and planarization); and the integrated circuit process, including MOSFET and CMOS.

  17. Nano integrated circuit process

    Energy Technology Data Exchange (ETDEWEB)

    Yoon, Yung Sup

    2004-02-15

    This book contains nine chapters: an introduction to semiconductor chip manufacture; oxidation (dry oxidation, wet oxidation, oxidation models, and oxide films); diffusion (the diffusion process, the diffusion equation, diffusion coefficients, and diffusion systems); ion implantation (ion distribution, channeling, multi-implantation, masking, and implantation systems); sputtering (CVD and PVD); lithography; wet and dry etching; interconnection and planarization (metal-silicon contacts, silicides, multilayer metallization, and planarization); and the integrated circuit process, including MOSFET and CMOS.

  18. Gibbs equilibrium averages and Bogolyubov measure

    International Nuclear Information System (INIS)

    Sankovich, D.P.

    2011-01-01

    Application of functional integration methods in the equilibrium statistical mechanics of quantum Bose systems is considered. We show that Gibbs equilibrium averages of Bose operators can be represented as path integrals over a special Gaussian measure defined in the corresponding space of continuous functions. We also consider some problems related to integration with respect to this measure.

  19. Robust iterative learning control for multi-phase batch processes: an average dwell-time method with 2D convergence indexes

    Science.gov (United States)

    Wang, Limin; Shen, Yiteng; Yu, Jingxian; Li, Ping; Zhang, Ridong; Gao, Furong

    2018-01-01

    In order to cope with system disturbances in multi-phase batch processes with different dimensions, a hybrid robust control scheme of iterative learning control combined with feedback control is proposed in this paper. First, with a hybrid iterative learning control law designed by introducing the state error, the tracking error and the extended information, the multi-phase batch process is converted into a two-dimensional Fornasini-Marchesini (2D-FM) switched system with different dimensions. Second, a switching signal is designed using the average dwell-time method integrated with the related switching conditions to give sufficient conditions ensuring stable running for the system. Finally, the minimum running time of the subsystems and the control law gains are calculated by solving the linear matrix inequalities. Meanwhile, a compound 2D controller with robust performance is obtained, which includes a robust extended feedback control for ensuring the steady-state tracking error to converge rapidly. The application on an injection molding process displays the effectiveness and superiority of the proposed strategy.

  20. Quasi-Stationary Temperature Field of Two-Layer Half-Space with Moving Boundary

    Directory of Open Access Journals (Sweden)

    P. A. Vlasov

    2015-01-01

    Full Text Available Due to the intensive introduction of mathematical modeling methods into engineering practice, analytical methods for solving problems of heat conduction theory, along with computational methods, become increasingly important. Despite the well-known limitations on the applicability of analytical methods, this trend is caused by many reasons. In particular, solutions of the appropriate problems presented in analytically closed form can be used to test new efficient computational algorithms, to carry out a parametric study of the temperature field of the analyzed system and explore specific features of its formation, and to formulate and solve optimization problems. In addition, these solutions allow us to explore the possibility of simplifying the mathematical model while retaining its adequacy to the studied process. The main goal of the conducted research is to provide an analytically closed-form solution to the problem of finding the quasi-stationary temperature field of a system simulated by an isotropic half-space with an isotropic coating of constant thickness. The outer boundary of this system is exposed to a Gaussian-type heat flux and moves uniformly in parallel with itself. A two-dimensional mathematical model that takes into account the axial symmetry of the studied process has been used. After the transition to a moving coordinate system rigidly associated with the moving boundary, the Hankel integral transform of zero order (with respect to the radial variable) and the Laplace transform (with respect to the temporal variable) were used. Next, the image of the Hankel transform for the stationary temperature field of the system with respect to the moving coordinate system was found using a limit theorem of operational calculus. This allowed representing the required quasi-stationary field in the form of an improper integral of the first kind, which depends on the parameters.
This result obtained can be used to conduct a parametric study and solve
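For reference, the two transforms named above have the standard forms (generic textbook definitions, not the paper's specific notation):

```latex
F(\xi) = \int_0^{\infty} f(r)\, J_0(\xi r)\, r \,\mathrm{d}r ,
\qquad
\bar{f}(s) = \int_0^{\infty} f(t)\, e^{-st} \,\mathrm{d}t ,
```

where $J_0$ is the Bessel function of the first kind of order zero; the inverse Hankel transform is $f(r) = \int_0^{\infty} F(\xi)\, J_0(\xi r)\, \xi \,\mathrm{d}\xi$.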

  1. Integrated control system for electron beam processes

    Science.gov (United States)

    Koleva, L.; Koleva, E.; Batchkova, I.; Mladenov, G.

    2018-03-01

    The ISO/IEC 62264 standard is widely used for integration of the business systems of a manufacturer with the corresponding manufacturing control systems based on hierarchical equipment models, functional data and manufacturing operations activity models. In order to achieve the integration of control systems, formal object communication models must be developed, together with manufacturing operations activity models, which coordinate the integration between different levels of control. In this article, the development of integrated control system for electron beam welding process is presented as part of a fully integrated control system of an electron beam plant, including also other additional processes: surface modification, electron beam evaporation, selective melting and electron beam diagnostics.

  2. Improved averaging for non-null interferometry

    Science.gov (United States)

    Fleig, Jon F.; Murphy, Paul E.

    2013-09-01

    Arithmetic averaging of interferometric phase measurements is a well-established method for reducing the effects of time varying disturbances, such as air turbulence and vibration. Calculating a map of the standard deviation for each pixel in the average map can provide a useful estimate of its variability. However, phase maps of complex and/or high density fringe fields frequently contain defects that severely impair the effectiveness of simple phase averaging and bias the variability estimate. These defects include large or small-area phase unwrapping artifacts, large alignment components, and voids that change in number, location, or size. Inclusion of a single phase map with a large area defect into the average is usually sufficient to spoil the entire result. Small-area phase unwrapping and void defects may not render the average map metrologically useless, but they pessimistically bias the variance estimate for the overwhelming majority of the data. We present an algorithm that obtains phase average and variance estimates that are robust against both large and small-area phase defects. It identifies and rejects phase maps containing large area voids or unwrapping artifacts. It also identifies and prunes the unreliable areas of otherwise useful phase maps, and removes the effect of alignment drift from the variance estimate. The algorithm has several run-time adjustable parameters to adjust the rejection criteria for bad data. However, a single nominal setting has been effective over a wide range of conditions. This enhanced averaging algorithm can be efficiently integrated with the phase map acquisition process to minimize the number of phase samples required to approach the practical noise floor of the metrology environment.
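A robust averaging step of the kind described can be sketched as follows. The rejection criteria here (a void-fraction cutoff for whole maps, then a per-pixel median/MAD screen) are illustrative assumptions, not the authors' algorithm, and NaN is used to mark voids:

```python
import numpy as np

def robust_phase_average(maps, map_reject_frac=0.1, pixel_sigma=3.0):
    """Robust per-pixel average and variability of a stack of phase maps.

    maps: array of shape (n_maps, rows, cols), NaN = void pixel.
    Maps whose void fraction exceeds map_reject_frac are dropped
    outright (large-area defects); surviving pixels are screened
    against a median/MAD outlier test (small-area defects).
    """
    maps = np.asarray(maps, dtype=float)
    # Reject whole maps with large-area voids or unwrapping artifacts.
    void_frac = np.isnan(maps).mean(axis=(1, 2))
    keep = maps[void_frac <= map_reject_frac]
    # Per-pixel robust center and spread across the remaining maps.
    med = np.nanmedian(keep, axis=0)
    mad = np.nanmedian(np.abs(keep - med), axis=0) + 1e-12
    # 1.4826 * MAD approximates one standard deviation for normal data.
    outlier = np.abs(keep - med) > pixel_sigma * 1.4826 * mad
    clean = np.where(outlier, np.nan, keep)
    return np.nanmean(clean, axis=0), np.nanstd(clean, axis=0)
```

Pruning defective pixels rather than whole maps is what keeps the variance estimate from being pessimistically biased by small-area defects.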

  3. Process integration of organic Rankine cycle

    International Nuclear Information System (INIS)

    Desai, Nishith B.; Bandyopadhyay, Santanu

    2009-01-01

    An organic Rankine cycle (ORC) uses an organic fluid as the working medium within a Rankine cycle power plant. ORC offers advantages over the conventional Rankine cycle with water as the working medium, as ORC generates shaft-work from low- to medium-temperature heat sources with higher thermodynamic efficiency. The dry and the isentropic fluids are the most preferred working fluids for the ORC. The basic ORC can be modified by incorporating both regeneration and turbine bleeding to improve its thermal efficiency. In this paper, 16 different organic fluids have been analyzed as working media for the basic as well as modified ORCs. A methodology is also proposed for appropriate integration and optimization of an ORC as a cogeneration process with the background process to generate shaft-work. It has been illustrated that the choice of cycle configuration for appropriate integration with the background process depends on the heat rejection profile of the background process (i.e., the shape of the below-pinch portion of the process grand composite curve). The benefits of integrating ORC with the background process and the applicability of the proposed methodology have been demonstrated through illustrative examples.

  4. Occupational injuries and sick leaves in household moving works.

    Science.gov (United States)

    Hwan Park, Myoung; Jeong, Byung Yong

    2017-09-01

    This study is concerned with household moving work and the characteristics of occupational injuries and sick leaves in each step of the moving process. Accident data for 392 occupational accidents were categorized by the moving process in which the accidents occurred, and possible incidents and sick leaves were assessed for each moving process and hazard factor. Accidents occurring during specific moving processes showed different characteristics depending on the type and agency of the accident. The accident type demanding the highest level of risk management was falls from a height in the 'lifting by ladder truck' process. Incidents ranked at a 'High' level of risk management included slips, being struck by objects, and musculoskeletal disorders in the 'manual materials handling' process. Also ranked 'High' were falls in 'loading/unloading', being struck by objects during 'lifting by ladder truck', and driving accidents in the 'transport' process. The findings of this study can be used to develop more effective accident prevention policy reflecting different circumstances and conditions to reduce occupational accidents in household moving work.

  5. Ensemble averaged coherent state path integral for disordered bosons with a repulsive interaction (Derivation of mean field equations)

    International Nuclear Information System (INIS)

    Mieck, B.

    2007-01-01

    We consider bosonic atoms with a repulsive contact interaction in a trap potential for Bose-Einstein condensation (BEC) and additionally include a random potential. The ensemble averages for two models of static (I) and dynamic (II) disorder are performed and investigated in parallel. The bosonic many-body systems of the two disorder models are represented by coherent state path integrals on the Keldysh time contour, which allow exact ensemble averages for zero and finite temperatures. These ensemble averages of coherent state path integrals therefore present alternatives to replica field theories or supersymmetric averaging techniques. Hubbard-Stratonovich transformations (HST) lead to two corresponding self-energies, for the hermitian repulsive interaction and for the non-hermitian disorder-interaction. The self-energy of the repulsive interaction is absorbed by a shift into the disorder-self-energy, which comprises, as an element of the larger symplectic Lie algebra sp(4M), the self-energy of the repulsive interaction as a subalgebra (equivalent to the direct product of M x sp(2); 'M' is the number of discrete time intervals of the disorder-self-energy in the generating function). After removal of the remaining Gaussian integral for the self-energy of the repulsive interaction, the first-order variations of the coherent state path integrals result in the exact mean field or saddle point equations, solely depending on the disorder-self-energy matrix. These equations can be solved by continued fractions and are reminiscent of the 'Nambu-Gorkov' Green function formalism in superconductivity, because anomalous terms or pair condensates of the bosonic atoms are also included in the self-energies. The derived mean field equations of the models with static (I) and dynamic (II) disorder are particularly applicable for BEC in d=3 spatial dimensions because of the singularity of the density of states at vanishing wavevector. However, one usually starts out from

  6. Improved Multiscale Entropy Technique with Nearest-Neighbor Moving-Average Kernel for Nonlinear and Nonstationary Short-Time Biomedical Signal Analysis

    Directory of Open Access Journals (Sweden)

    S. P. Arunachalam

    2018-01-01

    Full Text Available Analysis of biomedical signals can yield invaluable information for prognosis, diagnosis, therapy evaluation, risk assessment, and disease prevention, but such signals are often recorded as short time series that challenge existing complexity classification algorithms such as Shannon entropy (SE) and other techniques. The purpose of this study was to improve the previously developed multiscale entropy (MSE) technique by incorporating a nearest-neighbor moving-average kernel, which can be used for analysis of nonlinear and nonstationary short time series of physiological data. The approach was tested for robustness with respect to noise using simulated sinusoidal and ECG waveforms. The feasibility of MSE to discriminate between normal sinus rhythm (NSR) and atrial fibrillation (AF) was tested on a single-lead ECG. In addition, the MSE algorithm was applied to identify the pivot points of rotors that were induced in ex vivo isolated rabbit hearts. The improved MSE technique robustly estimated the complexity of the signal compared to SE under various noises, discriminated NSR and AF on a single-lead ECG, and precisely identified the pivot points of ex vivo rotors by providing better contrast between the rotor core and the peripheral region. The improved MSE technique can provide efficient complexity analysis of a variety of nonlinear and nonstationary short-time biomedical signals.
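The coarse-graining-plus-entropy idea behind MSE can be sketched as follows, using a generic sample-entropy estimator and a moving-average filter per scale. This is a sketch under stated assumptions: the published nearest-neighbor kernel and parameter choices may differ in detail.

```python
import numpy as np

def sample_entropy(x, m=2, r=0.2):
    """Sample entropy with template length m and tolerance r * SD(x)."""
    x = np.asarray(x, dtype=float)
    tol = r * x.std()
    def match_count(mm):
        # All templates of length mm, compared pairwise (Chebyshev distance).
        templ = np.lib.stride_tricks.sliding_window_view(x, mm)
        d = np.max(np.abs(templ[:, None, :] - templ[None, :, :]), axis=2)
        mask = ~np.eye(len(templ), dtype=bool)   # exclude self-matches
        return (d[mask] <= tol).sum()
    B, A = match_count(m), match_count(m + 1)
    return -np.log(A / B) if A > 0 and B > 0 else np.inf

def mse_moving_average(x, scales=(1, 2, 3, 4)):
    """MSE where each scale tau coarse-grains the signal with a
    moving-average filter of width tau before computing SampEn."""
    out = []
    for tau in scales:
        kernel = np.ones(tau) / tau
        smoothed = np.convolve(x, kernel, mode='valid')[::tau]
        out.append(sample_entropy(smoothed))
    return out
```

A regular signal (sine) should score lower entropy than white noise at scale 1, which is the basic sanity check for any such estimator.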

  7. MODELLING THE INTERACTION IN GAME SPORTS - RELATIVE PHASE AND MOVING CORRELATIONS

    Directory of Open Access Journals (Sweden)

    Martin Lames

    2006-12-01

    Full Text Available Model building in game sports should maintain the constitutive feature of this group of sports: the dynamic interaction process between the two parties. For single net/wall games, relative phase is suggested to describe the positional interaction between the two players. Thirty baseline rallies in tennis were examined, and the relative phase was calculated by Hilbert transform from the two time series of lateral displacement and trajectory in the court, respectively. Results showed that relative phase captures some aspects of the tactical interaction in tennis. At a more abstract level, the interaction between two teams in handball was studied by examining the relationship between the two scoring processes. Each process can be conceived as a random walk. Moving averages of the scoring probabilities indicate something like a momentary strength. A moving correlation (length = 20 ball possessions) describes the momentary relationship between the teams' strengths. Evidence was found that this correlation is heavily time-dependent; in almost every one of the 40 games examined, we found phases with a significant positive as well as a significant negative relationship. This underlines the importance of a dynamic view of the interaction in these games.
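The windowed correlation described above can be sketched as follows (window length 20 as in the text; the two input series, e.g. moving averages of each team's scoring probability per possession, are assumed inputs):

```python
import numpy as np

def moving_correlation(strength_a, strength_b, window=20):
    """Moving Pearson correlation between two teams' momentary-strength
    series over ball possessions; entries before one full window are NaN."""
    a = np.asarray(strength_a, dtype=float)
    b = np.asarray(strength_b, dtype=float)
    out = np.full(len(a), np.nan)
    for i in range(window - 1, len(a)):
        wa = a[i - window + 1:i + 1]
        wb = b[i - window + 1:i + 1]
        if wa.std() > 0 and wb.std() > 0:   # correlation undefined otherwise
            out[i] = np.corrcoef(wa, wb)[0, 1]
    return out
```

Plotting this series over a match would reveal the sign changes the study reports: phases where the teams' strengths move together and phases where they move in opposition.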

  8. Move-by-move dynamics of the advantage in chess matches reveals population-level learning of the game.

    Directory of Open Access Journals (Sweden)

    Haroldo V Ribeiro

    Full Text Available The complexity of chess matches has attracted broad interest since the game's invention. This complexity and the availability of a large number of recorded matches make chess an ideal model system for the study of population-level learning of a complex system. We systematically investigate the move-by-move dynamics of the white player's advantage from over seventy thousand high-level chess matches spanning over 150 years. We find that the average advantage of the white player is positive and that it has been increasing over time. Currently, the average advantage of the white player is 0.17 pawns, but it is exponentially approaching a value of 0.23 pawns with a characteristic time scale of 67 years. We also study the diffusion of the move dependence of the white player's advantage and find that it is non-Gaussian, has long-ranged anti-correlations, and that after an initial period with no diffusion it becomes super-diffusive. We find that the duration of the non-diffusive period, corresponding to the opening stage of a match, is increasing in length and exponentially approaching a value of 15.6 moves with a characteristic time scale of 130 years. We interpret these two trends as resulting from learning of the features of the game. Additionally, we find that the exponent [Formula: see text] characterizing the super-diffusive regime is increasing toward a value of 1.9, close to the ballistic regime. We suggest that this trend is due to the increased broadening of the range of abilities of chess players participating in major tournaments.

  9. Move-by-move dynamics of the advantage in chess matches reveals population-level learning of the game.

    Science.gov (United States)

    Ribeiro, Haroldo V; Mendes, Renio S; Lenzi, Ervin K; del Castillo-Mussot, Marcelo; Amaral, Luís A N

    2013-01-01

    The complexity of chess matches has attracted broad interest since the game's invention. This complexity and the availability of a large number of recorded matches make chess an ideal model system for the study of population-level learning of a complex system. We systematically investigate the move-by-move dynamics of the white player's advantage from over seventy thousand high-level chess matches spanning over 150 years. We find that the average advantage of the white player is positive and that it has been increasing over time. Currently, the average advantage of the white player is 0.17 pawns, but it is exponentially approaching a value of 0.23 pawns with a characteristic time scale of 67 years. We also study the diffusion of the move dependence of the white player's advantage and find that it is non-Gaussian, has long-ranged anti-correlations, and that after an initial period with no diffusion it becomes super-diffusive. We find that the duration of the non-diffusive period, corresponding to the opening stage of a match, is increasing in length and exponentially approaching a value of 15.6 moves with a characteristic time scale of 130 years. We interpret these two trends as resulting from learning of the features of the game. Additionally, we find that the exponent [Formula: see text] characterizing the super-diffusive regime is increasing toward a value of 1.9, close to the ballistic regime. We suggest that this trend is due to the increased broadening of the range of abilities of chess players participating in major tournaments.
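The reported exponential approach of the white advantage toward its asymptote can be written as a simple relaxation model. The asymptote (0.23 pawns) and time scale (67 years) come from the abstract; the initial value and reference year below are assumed purely for illustration:

```python
import numpy as np

def advantage_model(year, a_inf=0.23, a_start=0.10, tau=67.0, t0=1850.0):
    """White-player advantage (pawns) relaxing exponentially toward a_inf.

    a_inf and tau are the abstract's fitted values; a_start and t0 are
    hypothetical initial conditions, not data from the paper.
    """
    year = np.asarray(year, dtype=float)
    return a_inf + (a_start - a_inf) * np.exp(-(year - t0) / tau)

print(float(advantage_model(2000)))
```

The same functional form, with 15.6 moves and a 130-year time scale, would describe the lengthening of the opening stage.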

  10. CIPSS [computer-integrated process and safeguards system]: The integration of computer-integrated manufacturing and robotics with safeguards, security, and process operations

    International Nuclear Information System (INIS)

    Leonard, R.S.; Evans, J.C.

    1987-01-01

    This poster session describes the computer-integrated process and safeguards system (CIPSS). The CIPSS combines systems developed for factory automation and automated mechanical functions (robots) with varying degrees of intelligence (expert systems) to create an integrated system that would satisfy current and emerging security and safeguards requirements. Specifically, CIPSS is an extension of the automated physical security functions concepts. The CIPSS also incorporates the concepts of computer-integrated manufacturing (CIM) with integrated safeguards concepts, and draws upon the Defense Advance Research Project Agency's (DARPA's) strategic computing program

  11. Integration of thermal processes through Pinch technology

    International Nuclear Information System (INIS)

    Rios H, Carlos Mario; Grisales Rincon, Rogelio; Cardona, Carlos Ariel

    2004-01-01

    This paper presents the techniques of heat integration used for process optimization; their strengths and weaknesses during implementation in several specific processes are also discussed. It focuses on pinch technology, explaining algorithms for applying the method in industry. The paper provides the concepts and models involved in different types of commercial software that apply this method for energy cost reduction, both in the design of new plants and the improvement of existing ones. As a complement to the benefits of energy cost reduction, other favorable aspects of process integration are analysed, such as the reduction of waste emissions and combined heat and power systems.
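
The pinch method referred to here rests on the Problem Table Algorithm: shift stream temperatures by ΔTmin/2, cascade the heat surplus of each temperature interval, and read off the minimum hot and cold utilities and the pinch temperature. A minimal sketch; the stream data in the usage lines are a textbook-style illustration, not from the paper:

```python
def pinch_targets(hot, cold, dt_min):
    """Problem Table Algorithm sketch. hot/cold are lists of
    (supply_T, target_T, CP); returns (Q_hot_min, Q_cold_min, pinch_T),
    with pinch_T expressed as a shifted temperature."""
    # Shift hot streams down and cold streams up by dt_min/2.
    shifted = [(s - dt_min / 2, t - dt_min / 2, cp, 'hot') for s, t, cp in hot]
    shifted += [(s + dt_min / 2, t + dt_min / 2, cp, 'cold') for s, t, cp in cold]
    bounds = sorted({T for s, t, cp, kind in shifted for T in (s, t)}, reverse=True)
    cascade = [0.0]
    for hi, lo in zip(bounds, bounds[1:]):
        net_cp = 0.0
        for s, t, cp, kind in shifted:
            top, bot = max(s, t), min(s, t)
            if top >= hi and bot <= lo:  # stream spans this interval
                net_cp += cp if kind == 'hot' else -cp
        cascade.append(cascade[-1] + net_cp * (hi - lo))
    q_hot = -min(cascade)                      # hot utility covers the worst deficit
    feasible = [c + q_hot for c in cascade]    # feasible heat cascade
    q_cold = feasible[-1]
    pinch_T = bounds[feasible.index(min(feasible))]
    return q_hot, q_cold, pinch_T

# Illustrative four-stream problem (temperatures in C, CP in kW/C):
hot = [(250, 40, 0.15), (200, 80, 0.25)]
cold = [(20, 180, 0.20), (140, 230, 0.30)]
q_hot, q_cold, pinch = pinch_targets(hot, cold, dt_min=10)
```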

  12. A DIALECTICAL PERSPECTIVE OF TRAUMA PROCESSING

    Directory of Open Access Journals (Sweden)

    Brurit Laub

    2014-03-01

    Full Text Available This article presents a dialectical perspective, which attempts to elucidate the integrative components of trauma processing in therapy. It is proposed that the inherent movement toward greater integration is an expanding dialectical movement. It is conceived as a spiral resulting from the synergy of two dialectical movements. The horizontal line moves between the opposite aspects of the individual (thesis vs. antithesis) toward a synthesis. The vertical line moves upward via whole/part shifts toward greater integration, or downward toward disintegration and fragmentation. It is proposed that the complementary processes of differentiation and linking are the building blocks of the integrative/dialectical movement. Differentiation relates to the separation of parts and linking relates to their connection. The role of differentiation and linking in three basic interacting systems of trauma work is discussed. It is proposed that the dialectical principles are applicable to various therapeutic approaches, and clinical vignettes are included to illustrate them.

  13. Variance of discharge estimates sampled using acoustic Doppler current profilers from moving boats

    Science.gov (United States)

    Garcia, Carlos M.; Tarrab, Leticia; Oberg, Kevin; Szupiany, Ricardo; Cantero, Mariano I.

    2012-01-01

    This paper presents a model for quantifying the random errors (i.e., variance) of acoustic Doppler current profiler (ADCP) discharge measurements from moving boats for different sampling times. The model focuses on the random processes in the sampled flow field and has been developed using statistical methods currently available for uncertainty analysis of velocity time series. Analysis of field data collected using ADCPs from moving boats on three natural rivers of varying sizes and flow conditions shows that, even though the estimate of the integral time scale of the actual turbulent flow field is larger than the sampling interval, the integral time scale of the sampled flow field is on the order of the sampling interval. Thus, an equation for computing the variance error in discharge measurements associated with different sampling times, assuming uncorrelated flow fields, is appropriate. The approach is used to help define optimal sampling strategies by choosing the exposure time required for ADCPs to accurately measure flow discharge.
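
The central quantity in such a model is the variance of a time-averaged velocity. For a stationary process with integral time scale T_I sampled for a duration T much larger than T_I, a standard turbulence-statistics result gives var_mean ≈ 2·T_I·σ²/T; the paper's exact formulation may differ, so the sketch below is illustrative only:

```python
def mean_variance(var_u, t_integral, t_sample):
    """Variance of a time-averaged velocity for averaging time
    t_sample >> t_integral (stationary process assumption):
    var_mean ~ 2 * t_integral * var_u / t_sample."""
    return 2.0 * t_integral * var_u / t_sample

def required_exposure(var_u, t_integral, target_var):
    """Exposure time needed to bring the variance of the mean
    below target_var (inverts the relation above)."""
    return 2.0 * t_integral * var_u / target_var
```

Doubling the exposure time halves the variance of the estimate, which is the trade-off behind the optimal sampling strategies the abstract mentions.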

  14. Current control by a homopolar machine with moving brushes

    International Nuclear Information System (INIS)

    Vogel, H.

    1978-01-01

    The equation for TNS Doublet's E-coil circuit with a moving-brush homopolar machine is integrated in the flux of the homopolar for a monotonically increasing current function extending beyond the current reversal into the burn period. The results show that the moving-brush feature is not useful for controlling the burn.

  15. Multi-pulse orbits and chaotic dynamics in motion of parametrically excited viscoelastic moving belt

    International Nuclear Information System (INIS)

    Zhang Wei; Yao Minghui

    2006-01-01

    In this paper, the Shilnikov-type multi-pulse orbits and chaotic dynamics of a parametrically excited viscoelastic moving belt are studied in detail. Using the Kelvin-type viscoelastic constitutive law, the equations of motion for the viscoelastic moving belt with external damping and parametric excitation are given. The four-dimensional averaged equation for the case of primary parametric resonance is obtained by directly applying the method of multiple scales and Galerkin's approach to the partial differential governing equation of the viscoelastic moving belt. From the averaged equations obtained here, the theory of normal forms is used to give the explicit expressions of the normal form with a double zero and a pair of pure imaginary eigenvalues. Based on the normal form, the energy-phase method is employed to analyze the global bifurcations and chaotic dynamics of the parametrically excited viscoelastic moving belt. The global bifurcation analysis indicates that there exist heteroclinic bifurcations and Shilnikov-type multi-pulse homoclinic orbits in the averaged equation. These results imply the existence of chaos in the sense of the Smale horseshoe in the parametrically excited viscoelastic moving belt. The chaotic motions of viscoelastic moving belts are also found by numerical simulation. A new phenomenon of multi-pulse jumping orbits is observed in three-dimensional phase space.

  16. Implementation of the SMART MOVE intervention in primary care: a qualitative study using normalisation process theory.

    Science.gov (United States)

    Glynn, Liam G; Glynn, Fergus; Casey, Monica; Wilkinson, Louise Gaffney; Hayes, Patrick S; Heaney, David; Murphy, Andrew W M

    2018-05-02

    Problematic translational gaps continue to exist between demonstrating the positive impact of healthcare interventions in research settings and their implementation into routine daily practice. The aim of this qualitative evaluation of the SMART MOVE trial was to conduct a theoretically informed analysis, using normalisation process theory, of the potential barriers and levers to the implementation of an mHealth intervention to promote physical activity in primary care. The study took place in the West of Ireland with recruitment in the community from the Clare Primary Care Network. SMART MOVE trial participants and the staff from four primary care centres were invited to take part and all agreed to do so. A qualitative methodology was employed, combining focus groups (general practitioners, practice nurses and non-clinical staff from four separate primary care centres, n = 14) and individual semi-structured interviews (intervention and control SMART MOVE trial participants, n = 4), with purposeful sampling and analysis guided by the principles of Framework Analysis. Normalisation Process Theory was used to develop the topic guide for the interviews and also informed the data analysis process. Four themes emerged from the analysis: personal and professional exercise strategies; roles and responsibilities to support active engagement; utilisation challenges; and evaluation, adoption and adherence. It was evident that introducing a new healthcare intervention demands a comprehensive evaluation of the intervention itself and also of the environment in which it is to operate. Despite certain obstacles, the opportunity exists for the successful implementation of a novel healthcare intervention that addresses a hitherto unresolved healthcare need, provided that the intervention has strong usability attributes for both disseminators and target users and coheres strongly with the core objectives and culture of the health care environment in which it is to operate.

  17. Integrating ergonomic knowledge into engineering design processes

    DEFF Research Database (Denmark)

    Hall-Andersen, Lene Bjerg

    Integrating ergonomic knowledge into engineering design processes has been shown to contribute to healthy and effective designs of workplaces. However, it is also well-recognized that, in practice, ergonomists often have difficulties gaining access to and impacting engineering design processes...... employed in the same company, constituted a supporting factor for the possibilities to integrate ergonomic knowledge into the engineering design processes. However, the integration activities remained discrete and only happened in some of the design projects. A major barrier was related to the business...... to the ergonomic ambitions of the clients. The ergonomists’ ability to navigate, act strategically, and compromise on ergonomic inputs is also important in relation to having an impact in the engineering design processes. Familiarity with the engineering design terminology and the setup of design projects seems...

  18. Sustaining high energy efficiency in existing processes with advanced process integration technology

    International Nuclear Information System (INIS)

    Zhang, Nan; Smith, Robin; Bulatov, Igor; Klemeš, Jiří Jaromír

    2013-01-01

    Highlights: ► Process integration with better modelling and more advanced solution methods. ► Operational changes for better environmental performance through optimisation. ► Identification of process integration technology for operational optimisation. ► Systematic implementation procedure of process integration technology. ► A case study with crude oil distillation to demonstrate the operational flexibility. -- Abstract: To reduce emissions in the process industry, much emphasis has been put on making step changes in emission reduction, by developing new process technology and making renewable energy more affordable. However, the energy saving potential of existing systems cannot be simply ignored. In recent years, there have been significant advances in process integration technology with better modelling techniques and more advanced solution methods. These methods have been applied to new design and retrofit studies in the process industry. Here attempts are made to apply these technologies to improve the environmental performance of existing facilities through operational changes. An industrial project was carried out to demonstrate the importance and effectiveness of exploiting the operational flexibility for energy conservation. By applying an advanced optimisation technique to integrate the operation of distillation and heat recovery in a crude oil distillation unit, the energy consumption was reduced by 8% without capital expenditure. It shows that with correctly identified technology and the proper execution procedure, significant energy savings and emission reduction can be achieved very quickly without major capital expenditure. This allows the industry to improve its economic and environmental performance at the same time.

  19. Moving Horizon Estimation and Control

    DEFF Research Database (Denmark)

    Jørgensen, John Bagterp

    successful and applied methodology beyond PID-control for control of industrial processes. The main contribution of this thesis is introduction and definition of the extended linear quadratic optimal control problem for solution of numerical problems arising in moving horizon estimation and control...... problems. Chapter 1 motivates moving horizon estimation and control as a paradigm for control of industrial processes. It introduces the extended linear quadratic control problem and discusses its central role in moving horizon estimation and control. Introduction, application and efficient solution....... It provides an algorithm for computation of the maximal output admissible set for linear model predictive control. Appendix D provides results concerning linear regression. Appendix E discuss prediction error methods for identification of linear models tailored for model predictive control....

  20. The Effect of Direction on Cursor Moving Kinematics

    Directory of Open Access Journals (Sweden)

    Chiu-Ping Lu

    2012-02-01

    Full Text Available There have been only a few studies substantiating the kinematic characteristics of cursor movement. In this study, a quantitative experimental research method was used to explore the effect of moving direction on the kinematics of cursor movement in 24 typical young persons using our previously developed computerized measuring program. The results of multiple one-way repeated-measures ANOVAs and post hoc LSD tests demonstrated that the moving direction had effects on average velocity, movement time, movement unit and peak velocity. Moving leftward showed better efficiency than moving rightward, upward and downward, based on kinematic evidence such as velocity, movement unit and time. Moreover, the unique pattern of the power spectral density (PSD) of velocity (strategy for power application) explained why smoothness was still maintained while moving leftward even under an unstable situation with larger momentum. The information from this cursor moving study can guide us in relocating the toolbars and icons in the window interface, especially for individuals with physical disabilities whose performance is easily interrupted while controlling the cursor in specific directions.
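
The four measures compared across directions can be computed from a sampled cursor trace. A minimal 1-D sketch with assumed operational definitions (the study's own definitions, e.g. for movement units, may differ):

```python
def cursor_kinematics(xs, dt):
    """xs: sampled 1-D cursor positions; dt: sampling interval in seconds.
    Returns (movement_time, avg_velocity, peak_velocity, movement_units);
    a movement unit is counted here as one local maximum of the speed
    profile, i.e. one submovement (an assumed definition)."""
    v = [abs(b - a) / dt for a, b in zip(xs, xs[1:])]   # speed samples
    movement_time = dt * (len(xs) - 1)
    avg_velocity = sum(v) / len(v)
    peak_velocity = max(v)
    units = sum(1 for i in range(1, len(v) - 1) if v[i - 1] < v[i] >= v[i + 1])
    return movement_time, avg_velocity, peak_velocity, max(units, 1)
```

A smooth, single-peaked speed profile yields one movement unit; jerkier traces yield more, which is how smoothness differences across directions show up.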

  1. Transport of the moving barrier driven by chiral active particles

    Science.gov (United States)

    Liao, Jing-jing; Huang, Xiao-qun; Ai, Bao-quan

    2018-03-01

    Transport of a moving V-shaped barrier exposed to a bath of chiral active particles is investigated in a two-dimensional channel. Due to the chirality of active particles and the transversal asymmetry of the barrier position, active particles can power and steer the directed transport of the barrier in the longitudinal direction. The transport of the barrier is determined by the chirality of active particles. The moving barrier and active particles move in the opposite directions. The average velocity of the barrier is much larger than that of active particles. There exist optimal parameters (the chirality, the self-propulsion speed, the packing fraction, and the channel width) at which the average velocity of the barrier takes its maximal value. In particular, tailoring the geometry of the barrier and the active concentration provides novel strategies to control the transport properties of micro-objects or cargoes in an active medium.
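
The chirality invoked here means each particle's heading rotates at a fixed angular velocity, so in the absence of noise a particle traces a circle of radius v/|ω|. A minimal deterministic sketch of a single chiral active particle (noise, interactions and the barrier are omitted; parameters are illustrative):

```python
import math

def chiral_trajectory(v, omega, dt, steps):
    """Deterministic chiral active particle: constant speed v, heading
    rotating at angular velocity omega. Returns the list of positions;
    the path is (approximately) a circle of radius v/|omega|."""
    x = y = theta = 0.0
    pts = []
    for _ in range(steps):
        x += v * math.cos(theta) * dt
        y += v * math.sin(theta) * dt
        theta += omega * dt
        pts.append((x, y))
    return pts
```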

  2. Modeling and simulation of dust behaviors behind a moving vehicle

    Science.gov (United States)

    Wang, Jingfang

    behaviors. In addition, I introduce a temporal smoothing technique to eliminate the jagged effect caused by large simulation time. Several algorithms are used to speed up the simulation. For example, pre-calculated tables and display lists are created to replace some of the most commonly used functions, scripts and processes. The performance study shows that both time and space costs of the algorithms are linear in the number of particles in the system. On a Silicon Graphics Octane, three vehicles with 20,000 particles run at 6-8 frames per second on average. This speed does not include the extra calculations of convergence of the numerical integration for fluid dynamics which usually takes about 4-5 minutes to achieve steady state.

  3. CPICOR™: Clean power from integrated coal-ore reduction

    Energy Technology Data Exchange (ETDEWEB)

    Wintrell, R.; Miller, R.N.; Harbison, E.J.; LeFevre, M.O.; England, K.S.

    1997-12-31

    The US steel industry, in order to maintain its basic iron production, is thus moving to lower coke requirements and to the cokeless or direct production of iron. The US Department of Energy (DOE), in its Clean Coal Technology programs, has encouraged the move to new coal-based technology. The steel industry, in its search for alternative direct iron processes, has been limited to a single process, COREX®. The COREX® process, though offering commercial and environmental acceptance, produces a copious volume of offgas which must be effectively utilized to ensure an economical process. This volume, which normally exceeds the internal needs of a single steel company, offers a highly acceptable fuel for power generation. The utility companies seeking to offset future natural gas cost increases are interested in this clean fuel. The COREX® smelting process, when integrated with a combined cycle power generation facility (CCPG) and a cryogenic air separation unit (ASU), is an outstanding example of a new generation of environmentally compatible and highly energy efficient Clean Coal Technologies. This combination of highly integrated electric power and hot metal coproduction, has been designated CPICOR™, Clean Power from Integrated Coal/Ore Reduction.

  4. Materials issues in silicon integrated circuit processing

    International Nuclear Information System (INIS)

    Wittmer, M.; Stimmell, J.; Strathman, M.

    1986-01-01

    The symposium on ''Materials Issues in Integrated Circuit Processing'' sought to bring together all of the materials issues pertinent to modern integrated circuit processing. The inherent properties of the materials are becoming an important concern in integrated circuit manufacturing, and accordingly research in materials science is vital for the successful implementation of modern integrated circuit technology. The session on Silicon Materials Science revealed the advanced state of knowledge that topics such as point defects, intrinsic and extrinsic gettering and diffusion kinetics have achieved. Adaptation of this knowledge to specific integrated circuit processing technologies is beginning to be addressed. The session on Epitaxy included invited papers on epitaxial insulators and IR detectors. Heteroepitaxy on silicon is receiving great attention, and the results presented in this session suggest that 3-d integrated structures are an increasingly realistic possibility. Progress in low temperature silicon epitaxy and epitaxy of thin films with abrupt interfaces was also reported. Diffusion and Ion Implantation were well represented. Regrowth of implant-damaged layers and the nature of the defects which remain after regrowth were discussed in no less than seven papers. Substantial progress was also reported in the understanding of amorphising boron implants and the use of gallium implants for the formation of shallow p⁺ layers.

  5. THE INTEGRATION PROCESS MERCOSUR IN 2007 BY MODEL OF GLOBAL DIMENSION OF REGIONAL INTEGRATION

    Directory of Open Access Journals (Sweden)

    André Bechlin

    2013-04-01

    Full Text Available This paper analyzes the advance of the regional integration process in MERCOSUR (Southern Common Market) using the GDRI (Global Dimension of Regional Integration) model developed by Professor Mario Ruiz Estrada of the College of Economy and Administration of the University of Kuala Lumpur in Malaysia. A distinguishing characteristic of the model is its use of variables that are not specifically economic in origin but are derived from the evolution of trade processes. Inferring and comparing the external performance of the economies that compose MERCOSUR, and evaluating the impacts of the advance of regional and commercial integration, makes the inequalities that exist in the bloc evident. However, a common evolution is observed in the direction of intensifying integration between the economies, mainly after the process of economic opening experienced by the continent and the advance of integration within MERCOSUR from the 1990s onward. The analyzed data show that, in general, these economies are integrating into the world market and, in parallel, deepening the degree of integration among the members of the bloc.

  6. Integrated biofuels process synthesis

    DEFF Research Database (Denmark)

    Torres-Ortega, Carlo Edgar; Rong, Ben-Guang

    2017-01-01

    Second and third generation bioethanol and biodiesel are more environmentally friendly fuels than gasoline and petrodiesel, and more sustainable than first generation biofuels. However, their production processes are more complex and more expensive. In this chapter, we describe a two-stage synthesis......% used for bioethanol process), and steam and electricity from combustion (54% used as electricity) in the bioethanol and biodiesel processes. In the second stage, we saved about 5% in equipment costs and 12% in utility costs for bioethanol separation. This dual synthesis methodology, consisting of a top......-level screening task followed by a down-level intensification task, proved to be an efficient methodology for integrated biofuel process synthesis. The case study illustrates and provides important insights into the optimal synthesis and intensification of biofuel production processes with the proposed synthesis...

  7. Teaching Process Design through Integrated Process Synthesis

    Science.gov (United States)

    Metzger, Matthew J.; Glasser, Benjamin J.; Patel, Bilal; Hildebrandt, Diane; Glasser, David

    2012-01-01

    The design course is an integral part of chemical engineering education. A novel approach to the design course was recently introduced at the University of the Witwatersrand, Johannesburg, South Africa. The course aimed to introduce students to systematic tools and techniques for setting and evaluating performance targets for processes, as well as…

  8. An analytical boundary element integral approach to track the boundary of a moving cavity using electrical impedance tomography

    International Nuclear Information System (INIS)

    Khambampati, Anil Kumar; Kim, Sin; Lee, Bo An; Kim, Kyung Youn

    2012-01-01

    This paper is about locating the boundary of a moving cavity within a homogeneous background from the voltage measurements recorded on the outer boundary. An inverse boundary problem of a moving cavity is formulated by considering a two-phase vapor–liquid flow in a pipe. The conductivity of the flow components (vapor and liquid) is assumed to be constant and known a priori while the location and shape of the inclusion (vapor) are the unknowns to be estimated. The forward problem is solved using the boundary element method (BEM) with the integral equations solved analytically. A special situation is considered such that the cavity changes its location and shape during the time taken to acquire a full set of independent measurement data. The boundary of a cavity is assumed to be elliptic and is parameterized with Fourier series. The inverse problem is treated as a state estimation problem with the Fourier coefficients that represent the center and radii of the cavity as the unknowns to be estimated. An extended Kalman filter (EKF) is used as an inverse algorithm to estimate the time varying Fourier coefficients. Numerical experiments are shown to evaluate the performance of the proposed method. Through the results, it can be noticed that the proposed BEM with EKF method is successful in estimating the boundary of a moving cavity. (paper)
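
In this formulation the unknown boundary is a truncated Fourier series; keeping only the first-order terms reduces it to an ellipse whose centre and radii form the state vector the EKF tracks. A sketch of that parameterization (function names and the first-order truncation are illustrative):

```python
import math

def cavity_boundary(cx, cy, rx, ry, n_points=16):
    """First-order Fourier (ellipse) parameterization of a cavity boundary:
    x(t) = cx + rx*cos(t), y(t) = cy + ry*sin(t). In the EKF formulation
    sketched here, the estimated state is (cx, cy, rx, ry)."""
    return [(cx + rx * math.cos(2 * math.pi * k / n_points),
             cy + ry * math.sin(2 * math.pi * k / n_points))
            for k in range(n_points)]
```

Higher-order Fourier terms would let the boundary deviate from a pure ellipse at the cost of a larger state vector.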

  9. Integrating digital topology in image-processing libraries.

    Science.gov (United States)

    Lamy, Julien

    2007-01-01

    This paper describes a method to integrate digital topology information in image-processing libraries. This additional information allows a library user to write algorithms respecting topological constraints, for example, a seed fill or a skeletonization algorithm. As digital topology is absent from most image-processing libraries, such constraints cannot otherwise be fulfilled. We describe and give code samples for all the structures necessary for this integration, and show a use case in the form of a homotopic thinning filter inside ITK. The obtained filter can be up to a hundred times as fast as ITK's thinning filter and works for any image dimension. This paper mainly deals with integration within ITK, but the approach can be adapted with only minor modifications to other image-processing libraries.
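
The kind of constraint this enables shows up already in a seed fill, where the chosen digital connectivity decides whether diagonally touching pixels belong to the same component. A minimal self-contained sketch (plain Python, not ITK code):

```python
from collections import deque

def seed_fill(image, seed, connectivity=4):
    """Flood fill a binary image (list of rows, 1 = foreground) from seed,
    honouring a digital-topology choice of 4- or 8-adjacency."""
    offs = [(-1, 0), (1, 0), (0, -1), (0, 1)]
    if connectivity == 8:
        offs += [(-1, -1), (-1, 1), (1, -1), (1, 1)]
    h, w = len(image), len(image[0])
    filled, queue = set(), deque([seed])
    while queue:
        r, c = queue.popleft()
        if (r, c) in filled:
            continue
        if not (0 <= r < h and 0 <= c < w) or image[r][c] != 1:
            continue
        filled.add((r, c))
        for dr, dc in offs:
            queue.append((r + dr, c + dc))
    return filled
```

Two diagonally touching foreground pixels form one component under 8-adjacency but two components under 4-adjacency, which is exactly the information a topology-aware library must carry.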

  10. Counting on the mental number line to make a move: Sensorimotor ('pen') control and numerical processing

    NARCIS (Netherlands)

    Sheridan, R.; Rooijen, M. van; Giles, O.; Mushtaq, F.; Steenbergen, B.; Mon-Williams, M.; Waterman, A.H.

    2017-01-01

    Mathematics is often conducted with a writing implement. But is there a relationship between numerical processing and sensorimotor 'pen' control? We asked participants to move a stylus so it crossed an unmarked line at a location specified by a symbolic number (1-9), where number colour indicated

  11. Audiovisual integration in speech perception: a multi-stage process

    DEFF Research Database (Denmark)

    Eskelund, Kasper; Tuomainen, Jyrki; Andersen, Tobias

    2011-01-01

    investigate whether the integration of auditory and visual speech observed in these two audiovisual integration effects are specific traits of speech perception. We further ask whether audiovisual integration is undertaken in a single processing stage or multiple processing stages....

  12. Apparatus and process for continuous measurement of moisture in moving coal by neutron thermalization

    International Nuclear Information System (INIS)

    Stewart, R.F.

    1967-01-01

    The invention relates to an apparatus and process for the measurement of moisture contents in solid materials. More particularly, the invention makes available a continuous moisture analysis of a moving mass of material, such as coal, by penetrating such material with neutrons emitted from a source of fast neutrons and detecting, counting, and recording slowed or thermalized neutrons reflected from the internal structure of the material. (U.S.)

  13. P1-25: Filling-in the Blind Spot with the Average Direction

    Directory of Open Access Journals (Sweden)

    Sang-Ah Yoo

    2012-10-01

    Full Text Available Previous studies have shown that the visual system integrates local motions and perceives the average direction (Watamaniuk & Duchon, 1992, Vision Research, 32, 931–941). We investigated whether the surface of the blind spot is filled in with the average direction of the surrounding local motions. To test this, we varied the direction of a random-dot kinematogram (RDK) both in adaptation and test. Motion aftereffects (MAE) were defined as the difference in motion coherence thresholds between conditions with and without adaptation. The participants were initially adapted to an annular RDK surrounding the blind spot for 30 s in their dominant eyes. The direction of each dot in this RDK was selected equally and randomly from a normal distribution with a mean of 15° clockwise from vertical, one with a mean of 15° counterclockwise from vertical, or a mixture of the two. Immediately after the adaptation, a disk-shaped test RDK was presented for 1 s to the corresponding blind-spot location in the opposite eye. This RDK moved either 15° clockwise, 15° counterclockwise, or vertically (the average of the two directions). The participants' task was to discriminate the direction of the test RDK across different coherence levels. We found significant MAE when the test RDK had the same directions as the adaptor. More importantly, equally strong MAE was observed even when the direction of the test RDK was vertical, which was not physically present during adaptation. The result demonstrates that the visual system uses the average direction of the local surrounding motions to fill in the blind spot.
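
The "average direction" here is the circular (vector-sum) mean of the dot directions: for an equal mixture of directions 15° either side of vertical, that mean is vertical itself, the direction the blind spot was filled in with. A sketch of the computation:

```python
import math

def average_direction(angles_deg):
    """Circular (vector-sum) mean of a set of motion directions in degrees.
    Summing unit vectors avoids the wrap-around problem of a plain
    arithmetic mean of angles."""
    sx = sum(math.cos(math.radians(a)) for a in angles_deg)
    sy = sum(math.sin(math.radians(a)) for a in angles_deg)
    return math.degrees(math.atan2(sy, sx))
```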

  14. Experimental Verification of a Vehicle Localization based on Moving Horizon Estimation Integrating LRS and Odometry

    International Nuclear Information System (INIS)

    Sakaeta, Kuniyuki; Nonaka, Kenichiro; Sekiguchi, Kazuma

    2016-01-01

    Localization is an important function for robots to complete various tasks. For localization, both internal and external sensors are generally used. Odometry is widely used as the method based on internal sensors, but it suffers from cumulative errors. In the method using the laser range sensor (LRS), which is a kind of external sensor, the estimation accuracy is affected by the number of available measurement data. In our previous study, we applied moving horizon estimation (MHE) to vehicle localization, integrating the LRS measurement data and the odometry information, with their relative weightings balanced adaptively according to the number of available LRS measurement data. In this paper, the effectiveness of the proposed localization method is verified through both numerical simulations and experiments using a 1/10-scale vehicle. The verification is conducted in situations where the vehicle position cannot be localized uniquely in a certain direction using the LRS measurement data only. We achieve accurate localization even in such situations by integrating the odometry and LRS based on MHE. We also show the superiority of the method through comparisons with a method using an extended Kalman filter (EKF). (paper)
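
The balancing described here can be sketched as a horizon-wide least-squares cost in which the LRS term's weight grows with the number of available range returns; the specific linear scaling below is an assumption for illustration, not the paper's tuning, and the problem is reduced to 1-D for brevity:

```python
def mhe_cost(states, odo_increments, lrs_fixes, n_lrs_points,
             w_odo=1.0, w_lrs0=1.0):
    """Moving-horizon cost sketch (1-D): penalise disagreement with the
    odometry increments and with LRS position fixes over the horizon.
    states: candidate positions x_0..x_N; odo_increments: measured steps;
    lrs_fixes: {time index: LRS position fix}; the LRS weight is scaled
    by the number of available measurement points (assumed scaling)."""
    cost = 0.0
    for k in range(1, len(states)):
        cost += w_odo * (states[k] - states[k - 1] - odo_increments[k - 1]) ** 2
    for k, z in lrs_fixes.items():
        cost += w_lrs0 * n_lrs_points * (states[k] - z) ** 2
    return cost
```

An MHE solver would minimise this cost over the states in the horizon; when few LRS points are available the odometry term dominates, and vice versa.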

  15. Exponential Smoothing, Long Memory and Volatility Prediction

    DEFF Research Database (Denmark)

    Proietti, Tommaso

    three models that are natural extensions of ES: the fractionally integrated first order moving average (FIMA) model, a new integrated moving average model formulated in terms of the fractional lag operator (FLagIMA), and a fractional equal root integrated moving average (FerIMA) model, proposed...... originally by Hosking. We investigate the properties of the volatility components and the forecasts arising from these specification, which depend uniquely on the memory and the moving average parameters. For statistical inference we show that, under mild regularity conditions, the Whittle pseudo...
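
For reference, the recursion these models extend is plain exponential smoothing, whose one-step forecast equals the current smoothed level; it is the optimal predictor of an IMA(1,1) process with MA parameter θ = α − 1. That equivalence is standard time-series theory rather than a claim of the abstract, and the sketch below is a minimal illustration:

```python
def ses_forecast(y, alpha):
    """Simple exponential smoothing: s_t = alpha*y_t + (1-alpha)*s_{t-1},
    initialised at the first observation. Returns the final level, which
    is also the one-step-ahead forecast."""
    s = y[0]
    for obs in y[1:]:
        s = alpha * obs + (1 - alpha) * s
    return s
```

The fractional models in the abstract (FIMA, FLagIMA, FerIMA) generalise this one-parameter memory structure to long memory.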

  16. Integrated Main Propulsion System Performance Reconstruction Process/Models

    Science.gov (United States)

    Lopez, Eduardo; Elliott, Katie; Snell, Steven; Evans, Michael

    2013-01-01

    The Integrated Main Propulsion System (MPS) Performance Reconstruction process provides the MPS post-flight data files needed for postflight reporting to the project integration management and key customers to verify flight performance. This process/model was used as the baseline for the currently ongoing Space Launch System (SLS) work. The process utilizes several methodologies, including multiple software programs, to model integrated propulsion system performance through space shuttle ascent. It is used to evaluate integrated propulsion systems, including propellant tanks, feed systems, rocket engine, and pressurization systems performance throughout ascent based on flight pressure and temperature data. The latest revision incorporates new methods based on main engine power balance model updates to model higher mixture ratio operation at lower engine power levels.

  17. Detection of a novel, integrative aging process suggests complex physiological integration.

    Science.gov (United States)

    Cohen, Alan A; Milot, Emmanuel; Li, Qing; Bergeron, Patrick; Poirier, Roxane; Dusseault-Bélanger, Francis; Fülöp, Tamàs; Leroux, Maxime; Legault, Véronique; Metter, E Jeffrey; Fried, Linda P; Ferrucci, Luigi

    2015-01-01

    Many studies of aging examine biomarkers one at a time, but complex systems theory and network theory suggest that interpretations of individual markers may be context-dependent. Here, we attempted to detect underlying processes governing the levels of many biomarkers simultaneously by applying principal components analysis to 43 common clinical biomarkers measured longitudinally in 3694 humans from three longitudinal cohort studies on two continents (Women's Health and Aging I & II, InCHIANTI, and the Baltimore Longitudinal Study on Aging). The first axis was associated with anemia, inflammation, and low levels of calcium and albumin. The axis structure was precisely reproduced in all three populations and in all demographic sub-populations (by sex, race, etc.); we call the process represented by the axis "integrated albunemia." Integrated albunemia increases and accelerates with age in all populations, and predicts mortality and frailty--but not chronic disease--even after controlling for age. This suggests a role in the aging process, though causality is not yet clear. Integrated albunemia behaves more stably across populations than its component biomarkers, and thus appears to represent a higher-order physiological process emerging from the structure of underlying regulatory networks. If this is correct, detection of this process has substantial implications for physiological organization more generally.
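
The analysis step here is ordinary principal components analysis of the biomarker matrix. A dependency-free power-iteration sketch of extracting the first axis (the study would have used standard PCA software; this is only illustrative):

```python
def first_principal_axis(data, iters=200):
    """Leading principal component of centred data (list of rows) via
    power iteration on the sample covariance matrix."""
    n, p = len(data), len(data[0])
    means = [sum(row[j] for row in data) / n for j in range(p)]
    X = [[row[j] - means[j] for j in range(p)] for row in data]
    # sample covariance matrix
    C = [[sum(X[i][a] * X[i][b] for i in range(n)) / (n - 1)
          for b in range(p)] for a in range(p)]
    v = [1.0] * p
    for _ in range(iters):                      # power iteration
        w = [sum(C[a][b] * v[b] for b in range(p)) for a in range(p)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    return v
```

Applied to the 43 biomarkers, the loadings of this first axis are what define the "integrated albunemia" score for each subject.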

  18. Detection of a novel, integrative aging process suggests complex physiological integration.

    Directory of Open Access Journals (Sweden)

    Alan A Cohen

    Full Text Available Many studies of aging examine biomarkers one at a time, but complex systems theory and network theory suggest that interpretations of individual markers may be context-dependent. Here, we attempted to detect underlying processes governing the levels of many biomarkers simultaneously by applying principal components analysis to 43 common clinical biomarkers measured longitudinally in 3694 humans from three longitudinal cohort studies on two continents (Women's Health and Aging I & II, InCHIANTI, and the Baltimore Longitudinal Study on Aging). The first axis was associated with anemia, inflammation, and low levels of calcium and albumin. The axis structure was precisely reproduced in all three populations and in all demographic sub-populations (by sex, race, etc.); we call the process represented by the axis "integrated albunemia." Integrated albunemia increases and accelerates with age in all populations, and predicts mortality and frailty--but not chronic disease--even after controlling for age. This suggests a role in the aging process, though causality is not yet clear. Integrated albunemia behaves more stably across populations than its component biomarkers, and thus appears to represent a higher-order physiological process emerging from the structure of underlying regulatory networks. If this is correct, detection of this process has substantial implications for physiological organization more generally.

  19. Psychological needs and the facilitation of integrative processes.

    Science.gov (United States)

    Ryan, R M

    1995-09-01

    The assumption that there are innate integrative or actualizing tendencies underlying personality and social development is reexamined. Rather than viewing such processes as either nonexistent or as automatic, I argue that they are dynamic and dependent upon social-contextual supports pertaining to basic human psychological needs. To develop this viewpoint, I conceptually link the notion of integrative tendencies to specific developmental processes, namely intrinsic motivation; internalization; and emotional integration. These processes are then shown to be facilitated by conditions that fulfill psychological needs for autonomy, competence, and relatedness, and forestalled within contexts that frustrate these needs. Interactions between psychological needs and contextual supports account, in part, for the domain and situational specificity of motivation, experience, and relative integration. The meaning of psychological needs (vs. wants) is directly considered, as are the relations between concepts of integration and autonomy and those of independence, individualism, efficacy, and cognitive models of "multiple selves."

  20. Pedagogic process modeling: Humanistic-integrative approach

    Directory of Open Access Journals (Sweden)

    Boritko Nikolaj M.

    2007-01-01

    Full Text Available The paper deals with some current problems of modeling the dynamics of the subject-features development of the individual. The term "process" is considered in the context of the humanistic-integrative approach, in which the principles of self education are regarded as criteria for efficient pedagogic activity. Four basic characteristics of the pedagogic process are pointed out: intentionality reflects logicality and regularity of the development of the process; discreteness (stageability) indicates qualitative stages through which the pedagogic phenomenon passes; nonlinearity explains the crisis character of pedagogic processes and reveals inner factors of self-development; situationality requires a selection of pedagogic conditions in accordance with the inner factors, which would enable steering the pedagogic process. Offered are two steps for singling out a particular stage and the algorithm for developing an integrative model for it. The suggested conclusions might be of use for further theoretic research, analyses of educational practices and for realistic predicting of pedagogical phenomena.

  1. Integrating Thermal Tools Into the Mechanical Design Process

    Science.gov (United States)

    Tsuyuki, Glenn T.; Siebes, Georg; Novak, Keith S.; Kinsella, Gary M.

    1999-01-01

    The intent of mechanical design is to deliver a hardware product that meets or exceeds customer expectations, while reducing cycle time and cost. To this end, an integrated mechanical design process enables the idea of parallel development (concurrent engineering). This represents a shift from the traditional mechanical design process. With such a concurrent process, there are significant issues that have to be identified and addressed before re-engineering the mechanical design process to facilitate concurrent engineering. These issues also assist in the integration and re-engineering of the thermal design sub-process since it resides within the entire mechanical design process. With these issues in mind, a thermal design sub-process can be re-defined in a manner that has a higher probability of acceptance, thus enabling an integrated mechanical design process. However, the actual implementation is not always problem-free. Experience in applying the thermal design sub-process to actual situations provides the evidence for improvement, but more importantly, for judging the viability and feasibility of the sub-process.

  2. Moving the boundary between wavelength resources in optical packet and circuit integrated ring network.

    Science.gov (United States)

    Furukawa, Hideaki; Miyazawa, Takaya; Wada, Naoya; Harai, Hiroaki

    2014-01-13

    Optical packet and circuit integrated (OPCI) networks provide both optical packet switching (OPS) and optical circuit switching (OCS) links on the same physical infrastructure using a wavelength multiplexing technique in order to deal with best-effort services and quality-guaranteed services. To immediately respond to changes in user demand for OPS and OCS links, OPCI networks should dynamically adjust the amount of wavelength resources for each link. We propose a resource-adjustable hybrid optical packet/circuit switch and transponder. We also verify that distributed control of resource adjustments can be applied to the OPCI ring network testbed we developed. In cooperation with the resource adjustment mechanism and the hybrid switch and transponder, we demonstrate that automatically allocating a shared resource and moving the wavelength resource boundary between OPS and OCS links can be successfully executed, depending on the number of optical paths in use.

  3. First Zenith Total Delay and Integrated Water Vapour Estimates from the Near Real-Time GNSS Data Processing Systems at the University of Luxembourg

    Science.gov (United States)

    Ahmed, F.; Teferle, F. N.; Bingley, R. M.

    2012-04-01

    Since September 2011 the University of Luxembourg in collaboration with the University of Nottingham has been setting up two near real-time processing systems for ground-based GNSS data for the provision of zenith total delay (ZTD) and integrated water vapour (IWV) estimates. Both systems are based on Bernese v5.0, use the double-differenced network processing strategy and operate with a 1-hour (NRT1h) and 15-minutes (NRT15m) update cycle. Furthermore, the systems follow the approach of the E-GVAP METO and IES2 systems in that the normal equations for the latest data are combined with those from the previous four updates during the estimation of the ZTDs. NRT1h currently takes the hourly data from over 130 GNSS stations in Europe whereas NRT15m is primarily using the real-time streams of EUREF-IP. Both networks include additional GNSS stations in Luxembourg, Belgium and France. The a priori station coordinates for all of these stem from a moving average computed over the last 20 to 50 days and are based on the precise point positioning processing strategy. In this study we present the first ZTD and IWV estimates obtained from the NRT1h and NRT15m systems in development at the University of Luxembourg. In a preliminary evaluation we compare their performance to the IES2 system at the University of Nottingham and find the IWV estimates to agree at the sub-millimetre level.
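    The a priori coordinate step, a trailing moving average over the most recent 20 to 50 daily solutions, can be sketched as follows (a minimal illustration on synthetic data; the station values and the 20-day window are assumptions, not values from the systems described):

```python
import numpy as np

def moving_average(values, window):
    """Trailing moving average over consecutive samples of length `window`."""
    values = np.asarray(values, dtype=float)
    if values.size < window:
        raise ValueError("need at least `window` samples")
    kernel = np.ones(window) / window
    return np.convolve(values, kernel, mode="valid")

# Hypothetical daily height solutions (m) for one station over 25 days;
# the last moving-average value would serve as the a priori coordinate.
rng = np.random.default_rng(0)
heights = 100.0 + 0.003 * rng.standard_normal(25)
apriori = moving_average(heights, window=20)[-1]
```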

  4. A refined approach: Saudi Arabia moves beyond crude

    International Nuclear Information System (INIS)

    Krane, Jim

    2015-01-01

    Saudi Arabia's role in global energy markets is changing. The kingdom is reshaping itself as a supplier of refined petroleum products while moving beyond its long-held role as a simple exporter of crude oil. This change is commensurate with the typical development trajectory of a state progressing to a more advanced stage of global economic integration. Gains from increased refining include reducing fuel imports and capturing margins now bequeathed to competitors. Refining also allows the kingdom to export its heavy crude oil to a wider array of customers, beyond select importers configured to handle heavy crudes. However, the move also presents strategic complications. The world's 'swing supplier' of oil may grow less willing or able to adjust supply to suit market demands. In the process, Saudi Arabia may have to update the old “oil for security” relationship that links it with Washington, augmenting it with a more diverse set of economic and investment ties with individual companies and countries, including China. -- Highlights: •Saudi Arabia is diverting crude oil into an expanding refining sector. •In doing so, the kingdom is moving beyond its role as global “swing supplier” of crude oil. •The kingdom will benefit from increased refining, including enhanced demand for heavy crude. •Strategic complications may force it to seek security partners beyond Washington

  5. Introduction: Integration as a three-way process approach?

    NARCIS (Netherlands)

    Garcés-Mascareñas, B.; Penninx, R.; Garcés-Mascareñas, B.; Penninx, R.

    2016-01-01

    This chapter introduces the topic of this volume, which is the recent departure from viewing integration as a strictly two-way process (between migrants and the receiving society) to acknowledge the potential role that countries of origin might play in support of the integration process. It traces

  6. Review on Biomass Torrefaction Process and Product Properties and Design of Moving Bed Torrefaction System Model Development

    Energy Technology Data Exchange (ETDEWEB)

    Jaya Shankar Tumuluru; Christopher T. Wright; Shahab Sokhansanj

    2011-08-01

    Torrefaction is currently developing as an important preprocessing step to improve the quality of biomass in terms of physical properties, and proximate and ultimate composition. Torrefaction is a slow heating of biomass in an inert or reduced environment to a maximum temperature of 300 C. Torrefaction can also be defined as a group of products resulting from the partially controlled and isothermal pyrolysis of biomass occurring in a temperature range of 200-230 C and 270-280 C. Thus, the process can also be called a mild pyrolysis, as it occurs in the lower temperature range of the pyrolysis process. At the end of the torrefaction process, a solid uniform product with lower moisture content and higher energy content than raw biomass is produced. Most of the smoke-producing compounds and other volatiles are removed during torrefaction, which produces a final product that has a lower mass but a higher heating value. There is a lack of literature on the design aspects of torrefaction reactors and of a design sheet for estimating the dimensions of the torrefier based on capacity. This study includes (a) conducting a detailed review of the torrefaction of biomass in terms of understanding the process, product properties, off-gas compositions, and methods used, and (b) designing a moving bed torrefier, taking into account the basic fundamental heat and mass transfer calculations. Specific objectives include calculating dimensions such as the diameter and height of the moving packed bed torrefier for capacities ranging from 25-1000 kg/hr, designing the heat loads and gas flow rates, and

  7. An Indoor Continuous Positioning Algorithm on the Move by Fusing Sensors and Wi-Fi on Smartphones.

    Science.gov (United States)

    Li, Huaiyu; Chen, Xiuwan; Jing, Guifei; Wang, Yuan; Cao, Yanfeng; Li, Fei; Zhang, Xinlong; Xiao, Han

    2015-12-11

    Wi-Fi indoor positioning algorithms suffer from large positioning errors and low stability when continuously positioning terminals that are on the move. This paper proposes a novel indoor continuous positioning algorithm for terminals on the move, fusing sensors and Wi-Fi on smartphones. The main innovations are an improved Wi-Fi positioning algorithm and a novel positioning fusion algorithm named the Trust Chain Positioning Fusion (TCPF) algorithm. The improved Wi-Fi positioning algorithm was designed based on the properties of Wi-Fi signals on the move, which were found in a novel "quasi-dynamic" Wi-Fi signal experiment. The TCPF algorithm realizes "process-level" fusion of Wi-Fi and Pedestrian Dead Reckoning (PDR) positioning and comprises three parts: trusted point determination, trust state, and the positioning fusion algorithm. In a verification experiment carried out in a typical indoor environment, the average positioning error on the move is 1.36 m, a decrease of 28.8% compared to an existing algorithm. The results show that the proposed algorithm can effectively reduce the influence of unstable Wi-Fi signals and improve the accuracy and stability of indoor continuous positioning on the move.

  8. Integration of drinking water treatment plant process models and emulated process automation software

    NARCIS (Netherlands)

    Worm, G.I.M.

    2012-01-01

    The objective of this research is to limit the risks of fully automated operation of drinking water treatment plants and to improve their operation by using an integrated system of process models and emulated process automation software. This thesis contains the design of such an integrated system.

  9. Process Integration Analysis of an Industrial Hydrogen Production Process

    OpenAIRE

    Stolten, Detlef; Grube, Thomas; Tock, Laurence; Maréchal, François; Metzger, Christian; Arpentinier, Philippe

    2010-01-01

    The energy efficiency of an industrial hydrogen production process using steam methane reforming (SMR) combined with the water gas shift reaction (WGS) is analyzed using process integration techniques based on heat cascade calculation and pinch analysis with the aim of identifying potential measures to enhance the process performance. The challenge is to satisfy the high temperature heat demand of the SMR reaction by minimizing the consumption of natural gas to feed the combustion and to expl...

  10. Optimal integration of organic Rankine cycles with industrial processes

    International Nuclear Information System (INIS)

    Hipólito-Valencia, Brígido J.; Rubio-Castro, Eusiel; Ponce-Ortega, José M.; Serna-González, Medardo; Nápoles-Rivera, Fabricio; El-Halwagi, Mahmoud M.

    2013-01-01

    Highlights: • An optimization approach for heat integration is proposed. • A new general superstructure for heat integration is proposed. • Process heat streams are simultaneously integrated with an organic Rankine cycle. • Better results can be obtained with respect to other previously reported methodologies. - Abstract: This paper presents a procedure for simultaneously handling the problem of optimal integration of regenerative organic Rankine cycles (ORCs) with overall processes. ORCs may allow the recovery of an important fraction of the low-temperature process excess heat (i.e., waste heat from industrial processes) in the form of mechanical energy. An integrated stagewise superstructure is proposed for representing the interconnections and interactions between the heat exchanger network (HEN) and the ORC for fixed process stream data. Based on the integrated superstructure, the optimization problem is formulated as a mixed integer nonlinear programming problem to simultaneously account for the capital and operating costs, including the revenue from the sale of the shaft power produced by the integrated system. The application of this method is illustrated with three example problems. Results show that the proposed procedure provides significantly better results than an earlier sequential method for discovering optimal integrated systems, because it accounts simultaneously for the tradeoffs between the capital and operating costs as well as the sale of the produced energy. The proposed method also improves on previously reported methods for solving the synthesis problem of heat exchanger networks without the option of integration with an ORC (i.e., stand-alone heat exchanger networks).

  11. Quasi Ornstein-Uhlenbeck processes

    DEFF Research Database (Denmark)

    Barndorff-Nielsen, Ole Eiler; Basse-O'Connor, Andreas

    The question of existence and properties of stationary solutions to Langevin equations driven by noise processes with stationary increments is discussed, with particular focus on noise processes of pseudo moving average type. On account of the Wold-Karhunen decomposition theorem such solutions...... of the associated autocorrelation functions, both for small and large lags. Applications to Gaussian and Lévy driven fractional Ornstein-Uhlenbeck processes are presented. As an element in the derivations a Fubini theorem for Lévy bases is established....

  12. Quasi Ornstein-Uhlenbeck processes

    DEFF Research Database (Denmark)

    Barndorff-Nielsen, Ole Eiler; Basse-O'Connor, Andreas

    2011-01-01

    The question of existence and properties of stationary solutions to Langevin equations driven by noise processes with stationary increments is discussed, with particular focus on noise processes of pseudo-moving-average type. On account of the Wold–Karhunen decomposition theorem, such solutions are...... of the associated autocorrelation functions, both for small and large lags. Applications to Gaussian- and Lévy-driven fractional Ornstein–Uhlenbeck processes are presented. A Fubini theorem for Lévy bases is established as an element in the derivations....

  13. METHODOLOGY FRAMEWORK FOR PROCESS INTEGRATION AND SERVICE MANAGEMENT

    Directory of Open Access Journals (Sweden)

    Darko Galinec

    2007-06-01

    Full Text Available The history of information systems development was driven by the automation of business functions and by mergers and acquisitions, i.e., the integration of business entities into a whole. Modern business requires the integration of business processes through their dynamics, and thus enterprise application integration (EAI) as well. In this connection it is necessary to find ways and means of application integration and interaction in a consistent and reliable way. The real-time enterprise (RTE) monitors, captures and analyzes root causes and overt events that are critical to its success the instant those events occur [6]. EAI is determined by business needs and business requirements. It must be based on a business process repository and models, a business integration methodology (BIM) and information flow as well. Decisions concerning technology must serve successful application integration. In this paper, an EAI methodological framework and the technological concepts for achieving it are introduced.

  14. Integration of process computer systems to Cofrentes NPP

    International Nuclear Information System (INIS)

    Saettone Justo, A.; Pindado Andres, R.; Buedo Jimenez, J.L.; Jimenez Fernandez-Sesma, A.; Delgado Muelas, J.A.

    1997-01-01

    The existence of three different process computer systems in Cofrentes NPP and the ageing of two of them have led to the need for their integration into a single real time computer system, known as Integrated ERIS-Computer System (SIEC), which covers the functionality of the three systems: Process Computer (PC), Emergency Response Information System (ERIS) and Nuclear Calculation Computer (OCN). The paper describes the integration project developed, which has essentially consisted in the integration of PC, ERIS and OCN databases into a single database, the migration of programs from the old process computer into the new SIEC hardware-software platform and the installation of a communications programme to transmit all necessary data for OCN programs from the SIEC computer, which in the new configuration is responsible for managing the databases of the whole system. (Author)

  15. An Integrated Design Process

    DEFF Research Database (Denmark)

    Petersen, Mads Dines; Knudstrup, Mary-Ann

    2010-01-01

    Present paper is placed in the discussion about how sustainable measures are integrated in the design process by architectural offices. It presents results from interviews with four leading Danish architectural offices working with sustainable architecture and their experiences with it, as well...... as the requirements they meet in terms of how to approach the design process – especially focused on the early stages like a competition. The interviews focus on their experiences with working in multidisciplinary teams and using digital tools to support their work with sustainable issues. The interviews show...... the environmental measures cannot be discarded due to extra costs....

  17. Path Integral Formulation of Anomalous Diffusion Processes

    OpenAIRE

    Friedrich, Rudolf; Eule, Stephan

    2011-01-01

    We present the path integral formulation of a broad class of generalized diffusion processes. Employing the path integral we derive exact expressions for the path probability densities and joint probability distributions for the class of processes under consideration. We show that Continuous Time Random Walks (CTRWs) are included in our framework. A closed expression for the path probability distribution of CTRWs is found in terms of their waiting time distribution as the solution of a Dyson ...

  18. A path-integral approach to inclusive processes

    International Nuclear Information System (INIS)

    Sukumar, C.V.

    1995-01-01

    The cross section for an inclusive scattering process may be expressed in terms of a double path integral. Evaluation of the double path integral by the stationary-phase approximation yields classical equations of motion for the stationary trajectories and a classical cross section for the inclusive process which depends on the polarization of the initial state. Polarization analyzing powers are calculated from this theory and the results are compared with those obtained in an earlier paper. ((orig.))

  19. Integrating Usability Evaluations into the Software Development Process

    DEFF Research Database (Denmark)

    Lizano, Fulvio

    as relevant and strategic human–computer interaction (HCI) activities in the software development process, there are obstacles that limit the complete, effective and efficient integration of this kind of testing into the software development process. Two main obstacles are the cost of usability evaluations...... and the software developers' resistance to accepting users’ opinions regarding the lack of usability in their software systems. The ‘cost obstacle’ refers to the constraint of conducting usability evaluations in the software process due to the significant amount of resources required by this type of testing. Some......This thesis addresses the integration of usability evaluations into the software development process. The integration here is contextualized in terms of how to include usability evaluation as an activity in the software development lifecycle. Even though usability evaluations are considered...

  20. Conceptual information processing: A robust approach to KBS-DBMS integration

    Science.gov (United States)

    Lazzara, Allen V.; Tepfenhart, William; White, Richard C.; Liuzzi, Raymond

    1987-01-01

    Integrating the respective functionality and architectural features of knowledge base and data base management systems is a topic of considerable interest. Several aspects of this topic and associated issues are addressed. The significance of integration and the problems associated with accomplishing that integration are discussed. The shortcomings of current approaches to integration and the need to fuse the capabilities of both knowledge base and data base management systems motivates the investigation of information processing paradigms. One such paradigm is concept based processing, i.e., processing based on concepts and conceptual relations. An approach to robust knowledge and data base system integration is discussed by addressing progress made in the development of an experimental model for conceptual information processing.

  1. Autoregressive moving average (ARMA) model applied to quantification of cerebral blood flow using dynamic susceptibility contrast-enhanced magnetic resonance imaging

    International Nuclear Information System (INIS)

    Murase, Kenya; Yamazaki, Youichi; Shinohara, Masaaki

    2003-01-01

    The purpose of this study was to investigate the feasibility of the autoregressive moving average (ARMA) model for quantification of cerebral blood flow (CBF) with dynamic susceptibility contrast-enhanced magnetic resonance imaging (DSC-MRI) in comparison with deconvolution analysis based on singular value decomposition (DA-SVD). Using computer simulations, we generated a time-dependent concentration of the contrast agent in the volume of interest (VOI) from the arterial input function (AIF) modeled as a gamma-variate function under various CBFs, cerebral blood volumes and signal-to-noise ratios (SNRs) for three different types of residue function (exponential, triangular, and box-shaped). We also considered the effects of delay and dispersion in AIF. The ARMA model and DA-SVD were used to estimate CBF values from the simulated concentration-time curves in the VOI and AIFs, and the estimated values were compared with the assumed values. We found that the CBF value estimated by the ARMA model was more sensitive to the SNR and the delay in AIF than that obtained by DA-SVD. Although the ARMA model considerably overestimated CBF at low SNRs, it estimated the CBF more accurately than did DA-SVD at high SNRs for the exponential or triangular residue function. We believe this study will contribute to an understanding of the usefulness and limitations of the ARMA model when applied to quantification of CBF with DSC-MRI. (author)
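    The ARMA recursion underlying such models can be illustrated with a short simulation (a generic ARMA(1,1) sketch, not the DSC-MRI deconvolution itself; the parameter values, sample size, and function names are assumptions):

```python
import numpy as np

def simulate_arma11(phi, theta, n, sigma=1.0, seed=0):
    """Simulate x_t = phi * x_{t-1} + e_t + theta * e_{t-1} (Gaussian noise e_t)."""
    rng = np.random.default_rng(seed)
    e = sigma * rng.standard_normal(n)
    x = np.zeros(n)
    for t in range(1, n):
        x[t] = phi * x[t - 1] + e[t] + theta * e[t - 1]
    return x

def acf(x, lag):
    """Sample autocorrelation of x at the given lag."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    return float(np.dot(x[lag:], x[:-lag]) / np.dot(x, x))

x = simulate_arma11(phi=0.8, theta=0.3, n=5000)
rho1 = acf(x, 1)  # theory gives about 0.87 for these parameters
```

    In practice the (p, q) orders and coefficients are estimated from the measured concentration-time curves rather than fixed in advance.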

  2. Integrated coherent matter wave circuits

    International Nuclear Information System (INIS)

    Ryu, C.; Boshier, M. G.

    2015-01-01

    An integrated coherent matter wave circuit is a single device, analogous to an integrated optical circuit, in which coherent de Broglie waves are created and then launched into waveguides where they can be switched, divided, recombined, and detected as they propagate. Applications of such circuits include guided atom interferometers, atomtronic circuits, and precisely controlled delivery of atoms. We report experiments demonstrating integrated circuits for guided coherent matter waves. The circuit elements are created with the painted potential technique, a form of time-averaged optical dipole potential in which a rapidly moving, tightly focused laser beam exerts forces on atoms through their electric polarizability. The source of coherent matter waves is a Bose-Einstein condensate (BEC). We launch BECs into painted waveguides that guide them around bends and form switches, phase-coherent beamsplitters, and closed circuits. These are the basic elements needed to engineer arbitrarily complex matter wave circuitry.

  3. Amalgamation of Anomaly-Detection Indices for Enhanced Process Monitoring

    KAUST Repository

    Harrou, Fouzi

    2016-01-29

    Accurate and effective anomaly detection and diagnosis of modern industrial systems are crucial for ensuring reliability and safety and for maintaining desired product quality. Anomaly detection based on principal component analysis (PCA) has been studied intensively and largely applied to multivariate processes with highly cross-correlated process variables; however, conventional PCA-based methods often fail to detect small or moderate anomalies. In this paper, the proposed approach integrates two popular process-monitoring detection tools: the conventional PCA-based monitoring indices, Hotelling’s T2 and Q, and the exponentially weighted moving average (EWMA). We develop two EWMA tools based on the Q and T2 statistics, T2-EWMA and Q-EWMA, to detect anomalies in the process mean. The performance of the proposed methods was compared with that of conventional PCA-based anomaly-detection methods by applying each method to two examples: a synthetic data set and experimental data collected from a flow heating system. The results clearly show the benefits and effectiveness of the proposed methods over conventional PCA-based methods.
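    The EWMA filtering of a monitoring statistic, together with its asymptotic control limit, can be sketched as follows (a textbook-style sketch, not the paper's implementation; the smoothing constant lam = 0.2 and width L = 3 are conventional choices, and the sample statistic values are invented):

```python
import numpy as np

def ewma(stat, lam=0.2):
    """EWMA filter: z_t = lam * stat_t + (1 - lam) * z_{t-1}, with z_0 = stat_0."""
    stat = np.asarray(stat, dtype=float)
    z = np.empty_like(stat)
    z[0] = stat[0]
    for t in range(1, len(stat)):
        z[t] = lam * stat[t] + (1 - lam) * z[t - 1]
    return z

def ewma_limit(mean, std, lam=0.2, L=3.0):
    """Asymptotic upper control limit of the EWMA chart."""
    return mean + L * std * np.sqrt(lam / (2.0 - lam))

# Hypothetical monitoring statistic: in-control at first, then a small shift
t2 = np.array([1.0, 1.1, 0.9, 1.2, 3.5, 3.6])
z = ewma(t2)
ucl = ewma_limit(t2[:4].mean(), t2[:4].std(), lam=0.2, L=3.0)
alarms = z > ucl  # flags the shifted samples
```

    Smoothing the T2 or Q statistic this way accumulates evidence over time, which is what lets an EWMA chart catch the small mean shifts that the raw indices miss.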

  4. Study on moving target detection to passive radar based on FM broadcast transmitter

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    Target detection by a noncooperative illuminator is a topic of general interest in the electronic warfare field. First, direct-path interference (DPI) suppression, the bottleneck technique of moving target detection with a noncooperative frequency modulation (FM) broadcast transmitter, is analyzed in this article. Second, a space-time-frequency domain synthetic solution to this problem is introduced: adaptive nulling array processing is considered in the space domain, DPI cancellation based on an adaptive fractional delay interpolation (AFDI) technique is used in the time domain, and long-time coherent integration is utilized in the frequency domain. Finally, an experimental system is described that uses an FM broadcast transmitter as a noncooperative illuminator. Simulation results with real collected data show that the proposed method has better moving target detection performance.

  5. Exploration of Team Integration in Spanish Multifamily Residential Building Construction

    OpenAIRE

    Pellicer, Eugenio; Sanz Benlloch, María Amalia; Esmaeili, B.; MOLENAAR, KEITH ROBERT

    2016-01-01

    Project delivery team integration generally involves early involvement of general contractors and key specialty contractors in the design process. Team integration has been found to improve an owner’s probability of success. However, during difficult economic times, owners can forego early team involvement and move toward low bid procurement to take advantage of competitive markets. This study explores the performance of integrated teams in the Spanish multifamily building constructi...

  6. Business process management and IT management: The missing integration

    DEFF Research Database (Denmark)

    Rahimi, Fatemeh; Møller, Charles; Hvam, Lars

    2016-01-01

    The importance of business processes and the centrality of IT to contemporary organizations' performance calls for a specific focus on business process management and IT management. Despite the wide scope of business process management, covering both business and IT domains, and the profound impact of IT on process innovations, the association between business process management and IT management is under-explored. Drawing on a literature analysis of the capabilities of business process and IT governance frameworks and findings from a case study, we propose the need for horizontal integration between the two management functions to enable strategic and operational business-IT alignment. We further argue that the role of IT in an organization influences the direction of integration between the two functions and thus the choice of integration mechanisms. Using case study findings, we propose…

  7. Operating cost budgeting methods: quantitative methods to improve the process

    Directory of Open Access Journals (Sweden)

    José Olegário Rodrigues da Silva

    Full Text Available Abstract Operating cost forecasts are used in economic feasibility studies of projects and in the budgeting process. Studies have pointed out that some companies are not satisfied with the budgeting process, and chief executive officers want updates more frequently. In these cases, the main problem lies in the costs versus benefits. Companies seek simple and cheap forecasting methods without, at the same time, conceding in terms of the quality of the resulting information. This study aims to compare operating cost forecasting models to identify those that are relatively easy to implement and produce smaller deviations. For this purpose, we applied ARIMA (autoregressive integrated moving average) and distributed dynamic lag models to data from a Brazilian petroleum company. The results suggest that the models have potential application, and that multivariate models fitted the data better and forecast costs more accurately than univariate models.
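    To illustrate the autoregressive idea behind ARIMA, the sketch below fits the simplest special case, an ARIMA(1,0,0) (i.e., AR(1)) model, by ordinary least squares and iterates it forward. The synthetic cost series and all parameter values are invented for the example; a real study would use a full ARIMA implementation with differencing and moving-average terms.

```python
def fit_ar1(series):
    """Least-squares fit of x_t = c + phi*x_{t-1} + e_t
    (an ARIMA(1,0,0) model).  Returns (c, phi)."""
    x, y = series[:-1], series[1:]
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    var = sum((a - mx) ** 2 for a in x)
    phi = cov / var
    c = my - phi * mx
    return c, phi

def forecast_ar1(series, steps, c, phi):
    """Iterate the fitted recursion forward from the last observation."""
    out, last = [], series[-1]
    for _ in range(steps):
        last = c + phi * last
        out.append(last)
    return out

# Synthetic "monthly cost" series following x_t = 10 + 0.5*x_{t-1}
# (noise-free, purely for illustration; it converges toward 20).
hist = [5.0]
for _ in range(30):
    hist.append(10 + 0.5 * hist[-1])

c, phi = fit_ar1(hist)
pred = forecast_ar1(hist, 3, c, phi)
```

    On this noise-free series the least-squares fit recovers the generating parameters exactly, and the forecasts stay at the process mean.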

  8. AIM: An Integrated Approach to Organizational Improvement

    Directory of Open Access Journals (Sweden)

    Ronald A. Styron, Jr.

    2016-02-01

    Full Text Available This concept paper is based on a new problem-solving model of Blended Leadership called the Alloy Improvement Model (AIM). This model consists of an integration of change theory, leadership theory, and democratic principles and practices to form a comprehensive problem-solving strategy for organizational leaders. The utilization of AIM will assist leaders in moving from problems to solutions while engaging stakeholders in a comprehensive, efficient, inclusive, informative, integrated, and transparent process.

  9. Information Integration; The process of integration, evolution and versioning

    NARCIS (Netherlands)

    de Keijzer, Ander; van Keulen, Maurice

    2005-01-01

    At present, many information sources are available wherever you are. Most of the time, the information needed is spread across several of those information sources. Gathering this information is a tedious and time-consuming job. Automating this process would assist the user in this task. Integration

  10. Real time radial and tangential tomosynthesis system dedicated to on line x-ray examination of moving objects

    International Nuclear Information System (INIS)

    Antonakios, M.; Rizo, Ph.; Lamarque, P.

    2000-01-01

    This presentation describes a system able to compute and display, in real time, a reconstructed image of a moving object using tomosynthesis methods. With the object moving along a known trajectory between the x-ray source and a detector, the tomosynthesis is focused on a given surface of the object and makes it possible to reconstruct a sharp image of the structure on that surface, superimposed on a blurred image of the surrounding planes. The developed tomosynthesis algorithm is based on a set of look-up tables that provide, for each position of the object along the trajectory, the projection of a given point of the imaged surface onto the detector. Several hundred frames can be combined to compute the tomosynthesis image. The signal-to-noise ratio obtained on processed images is equivalent to the one obtained by averaging images of a static object. In order to speed up the tomosynthesis reconstruction and to reach the video frame rate, we integrated DSP-based hardware in a PC host. The geometric calibration parameters and the look-up tables are pre-computed on the PC. The on-line tomosynthesis calculation is carried out by the multi-DSP architecture, which manages in real time the frame acquisition, parallel tomosynthesis calculation, and output image display. In this particular implementation of tomosynthesis, up to one hundred video frames can be combined. We illustrate the potential of this system on an application of tomosynthesis to solid rocket motor examination.
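    The look-up-table reconstruction described above can be sketched as a shift-and-add average. The toy frames and LUT below are invented for illustration and use a 1-D detector for brevity; a real system would use 2-D detector images and LUTs derived from the calibrated trajectory.

```python
def tomosynthesis_average(frames, lut):
    """Shift-and-add tomosynthesis: for each point of the focused
    surface, average the detector samples that the look-up table
    assigns to it across all frames.  lut[k][i] gives the detector
    index of surface point i in frame k (precomputed from the known
    object trajectory)."""
    n_points = len(lut[0])
    acc = [0.0] * n_points
    for frame, mapping in zip(frames, lut):
        for i, det_idx in enumerate(mapping):
            acc[i] += frame[det_idx]
    n = len(frames)
    return [a / n for a in acc]

# Toy example: a bright 1-pixel feature moving one detector cell
# per frame as the object travels along its trajectory.
frames = [
    [0, 9, 0, 0],
    [0, 0, 9, 0],
    [0, 0, 0, 9],
]
# The LUT tracks the motion, so surface point 0 always maps onto the
# feature, while surface point 1 maps onto background.
lut = [[1, 0], [2, 0], [3, 0]]
img = tomosynthesis_average(frames, lut)
```

    Points lying on the focused surface reinforce across frames, while everything else is averaged toward a blur, which is the essence of the method.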

  11. Reliable classification of moving waste materials with LIBS in concrete recycling.

    Science.gov (United States)

    Xia, Han; Bakker, M C M

    2014-03-01

    Effective discrimination between different waste materials is of paramount importance for inline quality inspection of recycled concrete aggregates from demolished buildings. The targeted moving materials in the concrete waste stream are wood, PVC, gypsum block, glass, brick, steel rebar, aggregate, and cement paste. For each material, up to three different types were considered, and thirty particles of each material were selected. We propose a reliable classification methodology based on integration of the LIBS spectral emissions in a fixed time window, starting from the deployment of the laser shot. PLS-DA (multi-class) and the hybrid combination PCA-AdaBoost (binary-class) were investigated as classifiers. In addition, mean-centering and autoscaling approaches were compared for both classifiers. Using 72 training spectra and 18 test spectra per material, each averaged over ten shots, only PLS-DA achieved full discrimination, and the mean-centering approach made it slightly more robust. Continuing with PLS-DA, the relation between data averaging and convergence to 0.3% average error was investigated using 9-fold cross-validation. Single-shot PLS-DA presented the highest challenge and the most desirable methodology; it converged with 59 PCs. The degree of success in practical testing will depend on the quality of the training set and the implications of any remaining false positives. © 2013 Published by Elsevier B.V.
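    The two preprocessing approaches compared in the paper, mean centering and autoscaling, can be sketched as follows. The two "spectral channels" are toy data chosen to show why autoscaling equalizes channels of very different intensity before a classifier such as PLS-DA sees them.

```python
def mean_center(channels):
    """Subtract each channel's mean (per spectral variable)."""
    return [[v - sum(col) / len(col) for v in col] for col in channels]

def autoscale(channels):
    """Mean-center, then divide by the sample standard deviation so
    every channel gets unit variance (equal weight in PLS-DA/PCA)."""
    out = []
    for col in channels:
        m = sum(col) / len(col)
        sd = (sum((v - m) ** 2 for v in col) / (len(col) - 1)) ** 0.5
        out.append([(v - m) / sd for v in col])
    return out

# Two spectral channels with very different intensity scales:
channels = [[100.0, 110.0, 120.0], [1.0, 2.0, 3.0]]
centered = mean_center(channels)
scaled = autoscale(channels)
```

    After mean centering the strong channel still dominates in magnitude; after autoscaling both channels are on the same footing, which is exactly the trade-off the authors evaluate.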

  12. Dog days of summer: Influences on decision of wolves to move pups

    Science.gov (United States)

    Ausband, David E.; Mitchell, Michael S.; Bassing, Sarah B.; Nordhagen, Matthew; Smith, Douglas W.; Stahler, Daniel R.

    2016-01-01

    For animals that forage widely, protecting young from predation can span relatively long time periods due to the inability of young to travel with and be protected by their parents. Moving relatively immobile young to improve access to important resources, limit detection of concentrated scent by predators, and decrease infestations by ectoparasites can be advantageous. Moving young, however, can also expose them to increased mortality risks (e.g., accidents, getting lost, predation). For group-living animals that live in variable environments and care for young over extended time periods, the influence of biotic factors (e.g., group size, predation risk) and abiotic factors (e.g., temperature and precipitation) on the decision to move young is unknown. We used data from 25 satellite-collared wolves (Canis lupus) in Idaho, Montana, and Yellowstone National Park to evaluate how these factors could influence the decision to move pups during the pup-rearing season. We hypothesized that litter size, the number of adults in a group, and perceived predation risk would positively affect the number of times gray wolves moved pups. We further hypothesized that wolves would move their pups more often when it was hot and dry to ensure sufficient access to water. Contrary to our hypothesis, monthly temperature above the 30-year average was negatively related to the number of times wolves moved their pups. Monthly precipitation above the 30-year average, however, was positively related to the amount of time wolves spent at pup-rearing sites after leaving the natal den. We found little relationship between risk of predation (by grizzly bears, humans, or conspecifics) or group and litter sizes and the number of times wolves moved their pups. Our findings suggest that abiotic factors most strongly influence the decision of wolves to move pups, although responses to unpredictable biotic events (e.g., a predator encountering pups) cannot be ruled out.

  13. THE VELOCITY DISTRIBUTION OF NEARBY STARS FROM HIPPARCOS DATA. II. THE NATURE OF THE LOW-VELOCITY MOVING GROUPS

    International Nuclear Information System (INIS)

    Bovy, Jo; Hogg, David W.

    2010-01-01

    The velocity distribution of nearby stars (≲100 pc) contains many overdensities or 'moving groups', clumps of comoving stars, that are inconsistent with the standard assumption of an axisymmetric, time-independent, and steady-state Galaxy. We study the age and metallicity properties of the low-velocity moving groups based on the reconstruction of the local velocity distribution in Paper I of this series. We perform stringent, conservative hypothesis testing to establish for each of these moving groups whether it could conceivably consist of a coeval population of stars. We conclude that they do not: the moving groups are neither trivially associated with their eponymous open clusters nor with any other inhomogeneous star formation event. Concerning a possible dynamical origin of the moving groups, we test whether any of the moving groups has a higher or lower metallicity than the background population of thin disk stars, as would generically be the case if the moving groups are associated with resonances of the bar or spiral structure. We find clear evidence that the Hyades moving group has higher than average metallicity and weak evidence that the Sirius moving group has lower than average metallicity, which could indicate that these two groups are related to the inner Lindblad resonance of the spiral structure. Further, we find weak evidence that the Hercules moving group has higher than average metallicity, as would be the case if it is associated with the bar's outer Lindblad resonance. The Pleiades moving group shows no clear metallicity anomaly, arguing against a common dynamical origin for the Hyades and Pleiades groups. Overall, however, the moving groups are barely distinguishable from the background population of stars, raising the likelihood that the moving groups are associated with transient perturbations.

  14. Integrating angle-frequency domain synchronous averaging technique with feature extraction for gear fault diagnosis

    Science.gov (United States)

    Zhang, Shengli; Tang, J.

    2018-01-01

    Gear fault diagnosis relies heavily on the scrutiny of vibration responses measured. In reality, gear vibration signals are noisy and dominated by meshing frequencies as well as their harmonics, which oftentimes overlay the fault related components. Moreover, many gear transmission systems, e.g., those in wind turbines, constantly operate under non-stationary conditions. To reduce the influences of non-synchronous components and noise, a fault signature enhancement method that is built upon angle-frequency domain synchronous averaging is developed in this paper. Instead of being averaged in the time domain, the signals are processed in the angle-frequency domain to solve the issue of phase shifts between signal segments due to uncertainties caused by clearances, input disturbances, and sampling errors, etc. The enhanced results are then analyzed through feature extraction algorithms to identify the most distinct features for fault classification and identification. Specifically, Kernel Principal Component Analysis (KPCA) targeting at nonlinearity, Multilinear Principal Component Analysis (MPCA) targeting at high dimensionality, and Locally Linear Embedding (LLE) targeting at local similarity among the enhanced data are employed and compared to yield insights. Numerical and experimental investigations are performed, and the results reveal the effectiveness of angle-frequency domain synchronous averaging in enabling feature extraction and classification.
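    Once a signal has been resampled to the angle domain, synchronous averaging reduces to averaging fixed-length cycles: components synchronous with the rotation reinforce while non-synchronous content and noise cancel. The sketch below assumes that resampling has already been done; the repeating per-revolution pattern plus alternating "noise" is a toy example, not the paper's data.

```python
def synchronous_average(signal, samples_per_cycle):
    """Average the signal cycle-by-cycle in the angle domain.
    Assumes the signal has already been resampled to a constant
    number of samples per shaft revolution."""
    n_cycles = len(signal) // samples_per_cycle
    avg = [0.0] * samples_per_cycle
    for k in range(n_cycles):
        seg = signal[k * samples_per_cycle:(k + 1) * samples_per_cycle]
        for i, v in enumerate(seg):
            avg[i] += v
    return [a / n_cycles for a in avg]

# Toy signal: a repeating per-revolution pattern plus an alternating
# offset standing in for non-synchronous disturbance.
pattern = [0.0, 1.0, 0.0, -1.0]
signal = []
for k in range(8):
    noise = 0.5 if k % 2 == 0 else -0.5
    signal.extend(v + noise for v in pattern)

sa = synchronous_average(signal, 4)
```

    Over an even number of cycles the alternating disturbance averages out exactly, and the per-revolution pattern is recovered; feature extraction (KPCA, MPCA, LLE in the paper) then operates on the enhanced signal.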

  15. A depth semi-averaged model for coastal dynamics

    Science.gov (United States)

    Antuono, M.; Colicchio, G.; Lugni, C.; Greco, M.; Brocchini, M.

    2017-05-01

    The present work extends the semi-integrated method proposed by Antuono and Brocchini ["Beyond Boussinesq-type equations: Semi-integrated models for coastal dynamics," Phys. Fluids 25(1), 016603 (2013)], which comprises a subset of depth-averaged equations (similar to Boussinesq-like models) and a Poisson equation that accounts for vertical dynamics. Here, the subset of depth-averaged equations has been reshaped in a conservative-like form and both the Poisson equation formulations proposed by Antuono and Brocchini ["Beyond Boussinesq-type equations: Semi-integrated models for coastal dynamics," Phys. Fluids 25(1), 016603 (2013)] are investigated: the former uses the vertical velocity component (formulation A) and the latter a specific depth semi-averaged variable, ϒ (formulation B). Our analyses reveal that formulation A is prone to instabilities as wave nonlinearity increases. On the contrary, formulation B allows an accurate, robust numerical implementation. Test cases derived from the scientific literature on Boussinesq-type models—i.e., solitary and Stokes wave analytical solutions for linear dispersion and nonlinear evolution and experimental data for shoaling properties—are used to assess the proposed solution strategy. It is found that the present method gives reliable predictions of wave propagation in shallow to intermediate waters, in terms of both semi-averaged variables and conservation properties.

  16. Structural Model of the Relationships among Cognitive Processes, Visual Motor Integration, and Academic Achievement in Students with Mild Intellectual Disability (MID)

    Science.gov (United States)

    Taha, Mohamed Mostafa

    2016-01-01

    This study aimed to test a proposed structural model of the relationships and existing paths among cognitive processes (attention and planning), visual motor integration, and academic achievement in reading, writing, and mathematics. The study sample consisted of 50 students with mild intellectual disability or MID. The average age of these…

  17. Towards Slow-Moving Landslide Monitoring by Integrating Multi-Sensor InSAR Time Series Datasets: The Zhouqu Case Study, China

    Directory of Open Access Journals (Sweden)

    Qian Sun

    2016-11-01

    Full Text Available Although the past few decades have witnessed the great development of Synthetic Aperture Radar Interferometry (InSAR) technology in the monitoring of landslides, such applications are limited by geometric distortions and the ambiguity of 1D Line-Of-Sight (LOS) measurements, both of which are fundamental weaknesses of InSAR. Integration of multi-sensor InSAR datasets has recently shown its great potential in breaking through these two limits. In this study, 16 ascending images from the Advanced Land Observing Satellite (ALOS) and 18 descending images from the Environmental Satellite (ENVISAT) have been integrated to characterize and to detect the slow-moving landslides in Zhouqu, China between 2008 and 2010. Geometric distortions are first mapped by using the imaging geometric parameters of the SAR data and public Digital Elevation Model (DEM) data of Zhouqu, which allow the determination of the most appropriate data assembly for a particular slope. Subsequently, deformation rates along the respective LOS directions of the ALOS ascending and ENVISAT descending tracks are estimated by conducting InSAR time series analysis with a Temporarily Coherent Point (TCP) InSAR algorithm. As indicated by the geometric distortion results, 3D deformation rates of the Xieliupo slope at the east bank of the Pai-lung River are finally reconstructed by jointly exploiting the LOS deformation rates from the cross-heading datasets based on the surface-parallel flow assumption. It is revealed that the synergistic results of the ALOS and ENVISAT datasets provide a more comprehensive understanding and monitoring of the slow-moving landslides in Zhouqu.

  18. Fourier path-integral Monte Carlo methods: Partial averaging

    International Nuclear Information System (INIS)

    Doll, J.D.; Coalson, R.D.; Freeman, D.L.

    1985-01-01

    Monte Carlo Fourier path-integral techniques are explored. It is shown that fluctuation renormalization techniques provide an effective means for treating the effects of high-order Fourier contributions. The resulting formalism is rapidly convergent, is computationally convenient, and has potentially useful variational aspects

  19. An Indoor Continuous Positioning Algorithm on the Move by Fusing Sensors and Wi-Fi on Smartphones

    Directory of Open Access Journals (Sweden)

    Huaiyu Li

    2015-12-01

    Full Text Available Wi-Fi indoor positioning algorithms experience large positioning errors and low stability when continuously positioning terminals that are on the move. This paper proposes a novel indoor continuous positioning algorithm for terminals on the move, fusing sensors and Wi-Fi on smartphones. The main innovations include an improved Wi-Fi positioning algorithm and a novel positioning fusion algorithm named the Trust Chain Positioning Fusion (TCPF) algorithm. The improved Wi-Fi positioning algorithm was designed based on the properties of Wi-Fi signals on the move, which were found in a novel “quasi-dynamic” Wi-Fi signal experiment. The TCPF algorithm is proposed to realize the “process-level” fusion of Wi-Fi and Pedestrian Dead Reckoning (PDR) positioning, and it includes three parts: trusted point determination, trust state, and the positioning fusion algorithm. An experiment was carried out for verification in a typical indoor environment, and the average positioning error on the move is 1.36 m, a decrease of 28.8% compared to an existing algorithm. The results show that the proposed algorithm can effectively reduce the influence caused by unstable Wi-Fi signals, and improve the accuracy and stability of indoor continuous positioning on the move.
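    A minimal sketch of the fusion idea (not the authors' TCPF algorithm itself): predict the position with PDR step vectors and, when a Wi-Fi fix is available, correct with a weighted combination in which the weight plays the role of the trust assigned to the Wi-Fi estimate. All coordinates, step vectors, and the trust weight below are illustrative.

```python
def fuse_position(pdr_pos, wifi_pos, wifi_trust):
    """Weighted fusion of a dead-reckoned (PDR) 2-D position and a
    Wi-Fi fix.  wifi_trust in [0, 1] stands in for the trust given to
    the Wi-Fi estimate (high near a reliable fix, low otherwise)."""
    w = wifi_trust
    return tuple(w * a + (1.0 - w) * b for a, b in zip(wifi_pos, pdr_pos))

def track(start, steps, wifi_fixes, wifi_trust=0.3):
    """Propagate the position with PDR step vectors, correcting with
    the Wi-Fi fix (when available) at every epoch."""
    pos = start
    path = [pos]
    for step, fix in zip(steps, wifi_fixes):
        pos = (pos[0] + step[0], pos[1] + step[1])  # PDR prediction
        if fix is not None:
            pos = fuse_position(pos, fix, wifi_trust)  # Wi-Fi correction
        path.append(pos)
    return path

steps = [(1.0, 0.0), (1.0, 0.0)]          # two 1-m steps heading east
wifi = [None, (2.2, 0.0)]                 # one noisy Wi-Fi fix
path = track((0.0, 0.0), steps, wifi)
```

    With a trust of 0.3, the noisy Wi-Fi fix nudges the dead-reckoned position only slightly, which is the basic mechanism for damping unstable Wi-Fi measurements.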

  20. Maximum likelihood estimation for integrated diffusion processes

    DEFF Research Database (Denmark)

    Baltazar-Larios, Fernando; Sørensen, Michael

    We propose a method for obtaining maximum likelihood estimates of parameters in diffusion models when the data is a discrete time sample of the integral of the process, while no direct observations of the process itself are available. The data are, moreover, assumed to be contaminated by measurement errors. Integrated volatility is an example of this type of observations. Another example is ice-core data on oxygen isotopes used to investigate paleo-temperatures. The data can be viewed as incomplete observations of a model with a tractable likelihood function. Therefore we propose a simulated EM-algorithm to obtain maximum likelihood estimates of the parameters in the diffusion model. As part of the algorithm, we use a recent simple method for approximate simulation of diffusion bridges. In simulation studies for the Ornstein-Uhlenbeck process and the CIR process the proposed method works well.
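    The full method (integrated observations, measurement error, a simulated EM algorithm, diffusion bridges) is beyond a short sketch, but the core likelihood idea can be illustrated on the much simpler case of directly observed Ornstein-Uhlenbeck data, whose exact discrete-time transition is AR(1), so the drift parameter has a closed-form ML estimate. All parameter values below are arbitrary.

```python
import math
import random

def simulate_ou(theta, sigma, dt, n, x0=0.0, seed=42):
    """Exact discretization of dX = -theta*X dt + sigma dW:
    X_{t+dt} = a*X_t + sd*N(0,1) with a = exp(-theta*dt)."""
    rng = random.Random(seed)
    a = math.exp(-theta * dt)
    sd = sigma * math.sqrt((1.0 - a * a) / (2.0 * theta))
    xs = [x0]
    for _ in range(n):
        xs.append(a * xs[-1] + sd * rng.gauss(0.0, 1.0))
    return xs

def estimate_theta(xs, dt):
    """ML estimate of theta from direct discrete observations, using
    the AR(1) form: a_hat = sum(x_t * x_{t+dt}) / sum(x_t^2)."""
    num = sum(x * y for x, y in zip(xs[:-1], xs[1:]))
    den = sum(x * x for x in xs[:-1])
    return -math.log(num / den) / dt

xs = simulate_ou(theta=1.0, sigma=0.5, dt=0.1, n=20000)
theta_hat = estimate_theta(xs, dt=0.1)
```

    The paper's contribution is precisely that this direct route is unavailable when only noisy integrals of X are observed, which is what the simulated EM algorithm with diffusion bridges addresses.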

  1. An integrated computer aided system for integrated design of chemical processes

    DEFF Research Database (Denmark)

    Gani, Rafiqul; Hytoft, Glen; Jaksland, Cecilia

    1997-01-01

    In this paper, an Integrated Computer Aided System (ICAS), which is particularly suitable for solving problems related to the integrated design of chemical processes, is presented. ICAS features include a model generator (generation of problem specific models including model simplification and model … form the basis for the toolboxes. The available features of ICAS are highlighted through a case study involving the separation of binary azeotropic mixtures. (C) 1997 Elsevier Science Ltd.

  2. Trends And Economic Assessment Of Integration Processes At The Metal Market

    Directory of Open Access Journals (Sweden)

    Olga Aleksandrovna Romanova

    2015-03-01

    Full Text Available The article discusses the integration process from the perspective of three dimensions that characterize it: the increase in the number and appearance of new relationships; the strength, character, and stability of the emerging communications; and the dynamics and corresponding form of the process. Trends in the development of integration processes in metallurgy are identified, and the identification of five stages in the development of Russian metal trading is justified. We propose a step-by-step way to implement the integration process and develop a methodical approach to assessing the economic feasibility of integration between steel producers and steel traders, comprising three consecutive stages that apply, respectively, the principles of reflexive control, the entropy approach, and the traditional assessment of mergers and acquisitions. An algorithm for the practical realization of the authors' approach, which allows identification of the optimal trajectory of the integration process as a series of horizontal and vertical integration steps, is developed.

  3. Coal gasification by indirect heating in a single moving bed reactor: Process development & simulation

    Directory of Open Access Journals (Sweden)

    Junaid Akhlas

    2015-10-01

    Full Text Available In this work, the development and simulation of a new coal gasification process with indirect heat supply is performed. In this way, the need for pure oxygen production, as in a conventional gasification process, is avoided. The feasibility and energetic self-sufficiency of the proposed processes are addressed. To avoid the need for an Air Separation Unit, the heat required by the gasification reactions is supplied by the combustion flue gases and transferred to the reacting mixture through a bayonet heat exchanger installed inside the gasifier. Two alternatives for the flue gas generation have been investigated and compared. The proposed processes are modeled using chemical kinetics validated on experimental gasification data by means of a standard process simulator (Aspen Plus™), integrated with a spreadsheet for the modeling of a special type of heat exchanger. Simulation results are presented and discussed for the proposed integrated process schemes. It is shown that they do not need external energy supply and ensure overall efficiencies comparable to conventional processes while producing syngas with a lower content of carbon dioxide.

  4. Integration of e-learning outcomes into work processes

    Directory of Open Access Journals (Sweden)

    Kerstin Grundén

    2011-07-01

    Full Text Available Three case studies of in-house developed e-learning education in public organizations with different pedagogical approaches are used as a starting point for discussion regarding the implementation challenges of e-learning at work. The aim of this article is to contribute to the understanding of the mechanisms for integrating e-learning outcomes into work processes in large public organizations. The case studies were analyzed from a socio-cultural perspective using the MOA-model as a frame of reference. Although the pedagogical approaches in all of the cases seemed to be relevant, and most of the learners showed overall positive attitudes towards the courses, there were problems with the integration of the e-learning outcomes into work processes. There were deficiencies in the adaptation of the course contents to the local educational needs. There was also a lack of adjustment of the local work organization and work routines to facilitate the integration of the e-learning outcomes into the work processes. A lack of local management engagement affected the learners' motivation negatively. Group discussions in local work groups facilitated the integration of the e-learning outcomes. Many of the difficulties of integrating e-learning outcomes into work processes in big organizations are related to the problems of adjusting centrally developed e-learning courses to local needs and a lack of co-operation between the developers (often IT professionals) and the Human Resources Departments of the organizations.

  5. Poisson processes and a Bessel function integral

    NARCIS (Netherlands)

    Steutel, F.W.

    1985-01-01

    The probability of winning a simple game of competing Poisson processes turns out to be equal to the well-known Bessel function integral J(x, y) (cf. Y. L. Luke, Integrals of Bessel Functions, McGraw-Hill, New York, 1962). Several properties of J, some of which seem to be new, follow quite easily.
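    The game can be checked numerically: when two Poisson processes are merged, each successive event belongs to process A independently with probability λ_A/(λ_A+λ_B), so the win probability is easy to estimate by simulation. The rates and target counts below are illustrative.

```python
import random

def win_probability(rate_a, rate_b, m, n, trials=20000, seed=1):
    """Monte Carlo estimate of the probability that Poisson process A
    scores m events before Poisson process B scores n.  In the merged
    process, each event is from A independently with probability
    rate_a / (rate_a + rate_b)."""
    p = rate_a / (rate_a + rate_b)
    rng = random.Random(seed)
    wins = 0
    for _ in range(trials):
        a = b = 0
        while a < m and b < n:
            if rng.random() < p:
                a += 1
            else:
                b += 1
        wins += (a == m)
    return wins / trials

# Symmetric race to 1: the first event decides, so the probability is 1/2.
est = win_probability(1.0, 1.0, 1, 1)
```

    For general m, n, and rates, the estimate converges to the Bessel function integral J(x, y) discussed in the paper.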

  6. Diagnosis about Integration process of youth foreign Catalan

    Directory of Open Access Journals (Sweden)

    Marta Sabariego Puig

    2015-09-01

    Full Text Available This paper presents the results of a diagnostic study on the real integration process of young migrants in Catalonia. The study was carried out using a descriptive survey of 3,830 young Catalans from varying cultural backgrounds aged between 14 and 18, with the aim of identifying the key factors influencing the integration of Catalan migrant youth. Also, in order to analyze these key elements in greater depth as factors easing and/or obstructing integration, four discussion groups were held with the same young people. Results reveal achievements and challenges for further study, useful for the design of social and educational policies which may promote the integration process, understood in its structural, social, cognitive-cultural and identity dimensions. Our study confirms the need for a society with pluralistic beliefs, principles and actions, which should be reflected in democratic systems and social and educational policies based on the concept of integration as reciprocity.

  7. Carbon Nanotube Integration with a CMOS Process

    Science.gov (United States)

    Perez, Maximiliano S.; Lerner, Betiana; Resasco, Daniel E.; Pareja Obregon, Pablo D.; Julian, Pedro M.; Mandolesi, Pablo S.; Buffa, Fabian A.; Boselli, Alfredo; Lamagna, Alberto

    2010-01-01

    This work shows the integration of a sensor based on carbon nanotubes using CMOS technology. A chip sensor (CS) was designed and manufactured using a 0.30 μm CMOS process, leaving a free window on the passivation layer that allowed the deposition of SWCNTs over the electrodes. We successfully investigated with the CS the effect of humidity and temperature on the electrical transport properties of SWCNTs. The possibility of a large scale integration of SWCNTs with CMOS process opens a new route in the design of more efficient, low cost sensors with high reproducibility in their manufacture. PMID:22319330

  8. Stochastic modelling of the monthly average maximum and minimum temperature patterns in India 1981-2015

    Science.gov (United States)

    Narasimha Murthy, K. V.; Saravana, R.; Vijaya Kumar, K.

    2018-04-01

    The paper investigates the stochastic modelling and forecasting of monthly average maximum and minimum temperature patterns through a suitable seasonal autoregressive integrated moving average (SARIMA) model for the period 1981-2015 in India. The variations and distributions of monthly maximum and minimum temperatures are analyzed through box plots and cumulative distribution functions. The time series plot indicates that the maximum temperature series contains sharp peaks in almost all the years, while this is not true for the minimum temperature series, so the two series are modelled separately. The candidate SARIMA model has been chosen by observing the autocorrelation function (ACF), partial autocorrelation function (PACF), and inverse autocorrelation function (IACF) of the logarithmically transformed temperature series. The SARIMA (1, 0, 0) × (0, 1, 1)_12 model is selected for the monthly average maximum and minimum temperature series based on the minimum Bayesian information criterion. The model parameters are obtained using the maximum-likelihood method with the help of the standard errors of the residuals. The adequacy of the selected model is determined using correlation diagnostic checking through the ACF, PACF, IACF, and p values of the Ljung-Box test statistic of the residuals, and using normality diagnostic checking through the kernel and normal density curves of the histogram and the Q-Q plot. Finally, forecasts of the monthly maximum and minimum temperature patterns of India for the next 3 years are presented with the help of the selected model.
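    The seasonal D = 1 differencing at period 12 used by the selected SARIMA model can be sketched directly; the "annual cycle" data below is invented to show that a stable seasonal pattern differences to zero, and how forecasts of the differenced series are turned back into temperatures.

```python
def seasonal_difference(series, period=12):
    """The D = 1 seasonal differencing used in SARIMA models:
    y_t = x_t - x_{t-period} removes a stable annual cycle."""
    return [x - series[i] for i, x in enumerate(series[period:])]

def invert_seasonal_difference(diffs, head, period=12):
    """Rebuild the original scale from differences plus the first
    `period` observations (needed, e.g., to turn forecasts of the
    differenced series back into temperatures)."""
    out = list(head)
    for d in diffs:
        out.append(d + out[-period])
    return out

# A pure annual cycle (toy "monthly temperatures") repeated 3 years:
cycle = [10, 11, 14, 18, 23, 27, 30, 29, 25, 20, 15, 11]
series = [float(v) for v in cycle * 3]
d = seasonal_difference(series)
restored = invert_seasonal_difference(d, series[:12])
```

    On real data the differenced series is not zero but stationary, and the AR and seasonal MA terms of the SARIMA model are fitted to it.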

  9. Strategies to Move From Conceptual Models to Quantifying Resilience in FEW Systems

    Science.gov (United States)

    Padowski, J.; Adam, J. C.; Boll, J.; Barber, M. E.; Cosens, B.; Goldsby, M.; Fortenbery, R.; Fowler, A.; Givens, J.; Guzman, C. D.; Hampton, S. E.; Harrison, J.; Huang, M.; Katz, S. L.; Kraucunas, I.; Kruger, C. E.; Liu, M.; Luri, M.; Malek, K.; Mills, A.; McLarty, D.; Pickering, N. B.; Rajagopalan, K.; Stockle, C.; Richey, A.; Voisin, N.; Witinok-Huber, B.; Yoder, J.; Yorgey, G.; Zhao, M.

    2017-12-01

    Understanding interdependencies within Food-Energy-Water (FEW) systems is critical to maintain FEW security. This project examines how coordinated management of physical (e.g., reservoirs, aquifers, and batteries) and non-physical (e.g., water markets, social capital, and insurance markets) storage systems across the three sectors promotes resilience. Coordination increases effective storage within the overall system and enhances buffering against shocks at multiple scales. System-wide resilience can be increased with innovations in technology (e.g., smart systems and energy storage) and institutions (e.g., economic systems and water law). Using the Columbia River Basin as our geographical study region, we use an integrated approach that includes a continuum of science disciplines, moving from theory to practice. In order to understand FEW linkages, we started with detailed, connected conceptual models of the food, energy, water, and social systems to identify where key interdependencies (i.e., overlaps, stocks, and flows) exist within and between systems. These are used to identify stress and opportunity points, develop innovation solutions across FEW sectors, remove barriers to the adoption of solutions, and quantify increases in system-wide resilience to regional and global change. The conceptual models act as a foundation from which we can identify key drivers, parameters, time steps, and variables of importance to build and improve existing systems dynamic and biophysical models. Our process of developing conceptual models and moving to integrated modeling is critical and serves as a foundation for coupling quantitative components with economic and social domain components and analyses of how these interact through time and space. This poster provides a description of this process that pulls together conceptual maps and integrated modeling output to quantify resilience across all three of the FEW sectors (a.k.a. "The Resilience Calculator"). 

  10. Integrating climate change adaptation into Dutch local policies and the role of contextual factors.

    NARCIS (Netherlands)

    van den Berg, Maya Marieke; Coenen, Franciscus H.J.M.

    2012-01-01

    Moving towards a more sustainable adaptation process requires closer integration of policies related to the environment. An important actor in this is the local government. This paper examines to what extent adaptation is currently being integrated into Dutch local policies, and what the role is of

  11. Integration of MGDS design into the licensing process

    International Nuclear Information System (INIS)

    1997-12-01

    This paper presents an overview of how the Mined Geologic Disposal System (MGDS) design for a potential repository is integrated into the licensing process. The integration process employs a two-fold approach: (1) ensure that the MGDS design complies with applicable Nuclear Regulatory Commission (NRC) licensing requirements, and (2) ensure that the MGDS design is appropriately reflected in a license application that is acceptable to the NRC for performing acceptance and compliance reviews

  12. Plans, Patterns, and Move Categories Guiding a Highly Selective Search

    Science.gov (United States)

    Trippen, Gerhard

    In this paper we present our ideas for an Arimaa-playing program (also called a bot) that uses plans and pattern matching to guide a highly selective search. We restrict move generation to moves in certain move categories to reduce the number of moves considered by the bot significantly. Arimaa is a modern board game that can be played with a standard Chess set. However, the rules of the game are not at all like those of Chess. Furthermore, Arimaa was designed to be as simple and intuitive as possible for humans, yet challenging for computers. While all established Arimaa bots use alpha-beta search with a variety of pruning techniques and other heuristics ending in an extensive positional leaf node evaluation, our new bot, Rat, starts with a positional evaluation of the current position. Based on features found in the current position - supported by pattern matching using a directed position graph - our bot Rat decides which of a given set of plans to follow. The plan then dictates what types of moves can be chosen. This is another major difference from bots that generate "all" possible moves for a particular position. Rat is only allowed to generate moves that belong to certain categories. Leaf nodes are evaluated only by a straightforward material evaluation to help avoid moves that lose material. This highly selective search looks, on average, at only 5 moves out of 5,000 to over 40,000 possible moves in a middle game position.

  13. Business Process Management Integration Solution in Financial Sector

    Directory of Open Access Journals (Sweden)

    2009-01-01

    Full Text Available It is vital for financial services companies to ensure the rapid implementation of new processes to meet speed-to-market, service quality and compliance requirements. This has to be done against a background of increased complexity. An integrated approach to business processes allows products, processes, systems, data and the applications that underpin them to evolve quickly. Whether it’s providing a loan, setting up an insurance policy, or executing an investment instruction, optimizing the sale-to-fulfillment process will always win new business, cement customer loyalty, and reduce costs. Lack of integration across lending, payments and trading, on the other hand, simply presents competitors who are more efficient with a huge profit opportunity.

  14. Security management: a question of integration between people, systems and organizations

    International Nuclear Information System (INIS)

    Drukaroff, M. C.; Romano, A.

    2015-01-01

    Safety Management has always been the most important process of the Juzbado Factory since the beginning of operations in 1985. This process has evolved, moving from focusing primarily on the preventive control of operational risks by means of adequate exploitation of the Safety Systems, to integrating aspects related to Safety Culture and Organizational Factors as key players in achieving sustainable safety improvements. This paper presents how Safety Management has evolved at the Factory, with special emphasis on the integration of the three main factors affecting safety: people, technology and organizations. (Author)

  15. Enterprise Architecture Integration in E-government

    NARCIS (Netherlands)

    Janssen, M.F.W.H.A.; Cresswell, A.

    2005-01-01

    Achieving goals of better integrated and responsive government services requires moving away from stand alone applications toward more comprehensive, integrated architectures. As a result there is mounting pressure to move from disparate systems operating in parallel toward a shared architecture

  16. The mechanism of development of integration processes in the region

    Directory of Open Access Journals (Sweden)

    V. M. Bautin

    2017-01-01

    Full Text Available In the context of the weakening economic development of the region, it is necessary to find new ways to increase the efficiency of interaction between the economic structures of the region. One of these is the development of integration processes in the field of cooperation between public and private capital to meet the goals and objectives of the effective functioning of both the participants in the integration interaction and the region as a whole. The factors that influence the emergence and development of integration processes include resource scarcity, the need to diversify business, and the desire to improve the economic efficiency of business entities. The development of the integration process is the economic interaction of managing subjects, followed by their combination to achieve common objectives and obtain a synergistic effect through the solution of a number of resource, organizational and administrative problems. So that participants obtain high economic benefits from integration interaction, we have proposed a mechanism for the development of integration processes in the region, based on three levels of interaction between regional authorities, educational institutions and private organizations. It allows the formation of a single integration chain and process management to increase the effectiveness of their implementation in practice, and avoids the disadvantages associated with the formation of integrated structures. Integration cooperation of regional authorities with organizations in various spheres of activity of the region and with education (research) organizations is a key component of the new Russian innovation policy because, if done right, it provides broader benefits from investments in research and development, creating favorable conditions for sustainable innovation development, and is a strategic factor in the economic growth of the region.

  17. Estimation of direction of arrival of a moving target using subspace based approaches

    Science.gov (United States)

    Ghosh, Ripul; Das, Utpal; Akula, Aparna; Kumar, Satish; Sardana, H. K.

    2016-05-01

    In this work, array processing techniques based on subspace decomposition of the signal have been evaluated for estimation of the direction of arrival of moving targets using acoustic signatures. Three subspace-based approaches - Incoherent Wideband Multiple Signal Classification (IWM), Least Squares Estimation of Signal Parameters via Rotational Invariance Techniques (LS-ESPRIT) and Total Least Squares ESPRIT (TLS-ESPRIT) - are considered. Their performance is compared with conventional time delay estimation (TDE) approaches such as Generalized Cross Correlation (GCC) and Average Square Difference Function (ASDF). Performance evaluation has been conducted on experimentally generated data consisting of acoustic signatures of four different types of civilian vehicles moving in defined geometrical trajectories. Mean absolute error and standard deviation of the DOA estimates w.r.t. ground truth are used as performance evaluation metrics. Lower statistical values of mean error confirm the superiority of subspace-based approaches over TDE-based techniques. Amongst the compared methods, LS-ESPRIT indicated better performance.
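
The TDE baselines in this record (GCC, ASDF) estimate the inter-sensor time delay from the peak of a cross-correlation, which maps to a bearing once the sensor spacing is known. A minimal sketch of plain GCC on synthetic data (the signals, noise level and integer-sample delay are illustrative; PHAT weighting would additionally normalize the cross-spectrum by its magnitude):

```python
import numpy as np

rng = np.random.default_rng(0)
n, true_delay = 4096, 37                       # samples; illustrative values
s = rng.standard_normal(n)                     # broadband source signal
x1 = s + 0.1 * rng.standard_normal(n)          # sensor 1
x2 = np.roll(s, true_delay) + 0.1 * rng.standard_normal(n)  # sensor 2: delayed copy

# generalized cross-correlation via the FFT (circular, unweighted)
X1, X2 = np.fft.rfft(x1), np.fft.rfft(x2)
cc = np.fft.irfft(X2 * np.conj(X1), n)
est_delay = int(np.argmax(cc))                 # lag of the correlation peak
```

The peak lag recovers the inserted delay; dividing it by the sampling rate and the microphone spacing would give the bearing angle.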

  18. Improving ISD Agility in Fast-Moving Software Organizations

    DEFF Research Database (Denmark)

    Persson, John Stouby; Nørbjerg, Jacob; Nielsen, Peter Axel

    2016-01-01

    Fast-moving software organizations must respond quickly to changing technological options and market trends while delivering high-quality services at competitive prices. Improving agility of information systems development (ISD) may reconcile these inherent tensions, but previous research...... study on how to improve ISD agility in a fast-moving software organization. The study maps central problems in the ISD management to direct improvements of agility. Our following intervention addressed method improvements in defining types of ISD by customer relations and integrating the method...... with the task management tool used by the organization. The paper discusses how the action research contributes to our understanding of ISD agility in fast-moving software organizations with a framework for mapping and evaluating improvements of agility. The action research specifically points out that project...

  19. Biodiesel production process from microalgae oil by waste heat recovery and process integration.

    Science.gov (United States)

    Song, Chunfeng; Chen, Guanyi; Ji, Na; Liu, Qingling; Kansha, Yasuki; Tsutsumi, Atsushi

    2015-10-01

    In this work, the optimization of microalgae oil (MO) based biodiesel production process is carried out by waste heat recovery and process integration. The exergy analysis of each heat exchanger presented an efficient heat coupling between hot and cold streams, thus minimizing the total exergy destruction. Simulation results showed that the unit production cost of optimized process is 0.592$/L biodiesel, and approximately 0.172$/L biodiesel can be avoided by heat integration. Although the capital cost of the optimized biodiesel production process increased 32.5% and 23.5% compared to the reference cases, the operational cost can be reduced by approximately 22.5% and 41.6%. Copyright © 2015 Elsevier Ltd. All rights reserved.

  20. Research on Motivation in Collaborative Learning: Moving beyond the Cognitive-Situative Divide and Combining Individual and Social Processes

    Science.gov (United States)

    Jarvela, Sanna; Volet, Simone; Jarvenoja, Hanna

    2010-01-01

    In this article we propose that in order to advance our understanding of motivation in collaborative learning we should move beyond the cognitive-situative epistemological divide and combine individual and social processes. Our claim is that although recent research has recognized the importance of social aspects in emerging and sustained…

  1. Path planning for first responders in the presence of moving obstacles

    Directory of Open Access Journals (Sweden)

    Zhiyong Wang

    2015-06-01

    the above research questions, this research has been conducted using the following outline: 1. literature review; 2. conceptual design and analysis; 3. implementation of the prototype; and 4. assessment of the prototype and adaptation. We investigated previous research related to navigation in disasters, and designed an integrated navigation system architecture, assisting responders in spatial data storage, processing and analysis. Within this architecture, we employ hazard models to provide the predicted information about the obstacles, and select a geo-database to store the data needed for emergency navigation. Throughout the development of the prototype navigation system, we have proposed: • a taxonomy of navigation among obstacles, which categorizes navigation cases on the basis of type and multiplicity of first responders, destinations, and obstacles; • a multi-agent system, which supports information collection from hazard simulations, spatio-temporal data processing and analysis, connection with a geo-database, and route generation in dynamic environments affected by disasters; • data models, which structure the information required for finding paths among moving obstacles, capturing both static information, such as the type of the response team and the topology of the road network, and dynamic information, such as changing availabilities of roads during disasters, the uncertainty of the moving obstacles generated from hazard simulations, and the position of the vehicle; • path planning algorithms, which generate routes for one or more responders in the presence of moving obstacles. Using the speed of vehicles, departure time, and the predicted information about the state of the road network, etc., three versions (I, II, and III) of Moving Obstacle Avoiding A* (MOAAstar) algorithms are developed: 1. MOAAstar–I/Non-waiting, which supports path planning in the case of forest fires; 2. MOAAstar–II/Waiting, which introduces waiting options to avoid moving
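
The MOAAstar algorithms themselves are not reproduced in this record, but the core idea behind the Waiting variant — searching over (cell, time) states so a responder may pause until a moving obstacle clears — can be sketched on a toy grid. Everything here (the function name, the blocked() predicate, the corridor example) is illustrative, not the paper's implementation:

```python
import heapq

def plan_path(start, goal, width, height, blocked, max_time=50):
    """A*-style search over (cell, time) states; waiting in place is allowed."""
    def h(c):  # Manhattan distance: admissible for unit-cost moves
        return abs(c[0] - goal[0]) + abs(c[1] - goal[1])
    frontier = [(h(start), 0, start, (start,))]  # (f, time, cell, path so far)
    seen = set()
    while frontier:
        _, t, cell, path = heapq.heappop(frontier)
        if cell == goal:
            return list(path)
        if (cell, t) in seen or t >= max_time:
            continue
        seen.add((cell, t))
        x, y = cell
        for dx, dy in ((0, 0), (1, 0), (-1, 0), (0, 1), (0, -1)):  # (0, 0) = wait
            nx, ny = x + dx, y + dy
            if 0 <= nx < width and 0 <= ny < height and not blocked((nx, ny), t + 1):
                heapq.heappush(frontier,
                               (t + 1 + h((nx, ny)), t + 1, (nx, ny), path + ((nx, ny),)))
    return None

# toy corridor: cell (1, 0) is occupied by a moving obstacle at t = 1 and t = 2
blocked = lambda c, t: c == (1, 0) and t in (1, 2)
path = plan_path((0, 0), (2, 0), 3, 1, blocked)  # waits until the obstacle passes
```

Because time is part of the state, the search can prefer a short wait over a long detour, which is the distinguishing behavior of the Waiting variant described above.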

  2. The integrity management cycle as a business process

    Energy Technology Data Exchange (ETDEWEB)

    Ackhurst, Trent B.; Peverelli, Romina P. [PIMS - Pipeline Integrity Management Specialists of London Ltd. (United Kingdom).

    2009-07-01

    It is a best-practice Oil and Gas pipeline integrity and reliability technique to apply integrity management cycles. This conforms to the business principles of continuous improvement. This paper examines the integrity management cycle - both its goals and objectives and its subsequent component steps - from a business perspective. Traits that businesses require to glean maximum benefit from such a cycle are highlighted. A case study focuses upon an integrity and reliability process developed to apply to pipeline operators' installations. This is compared and contrasted to the pipeline integrity management cycle to underline both cycles' consistency with the principles of continuous improvement. (author)

  3. Neutron Thermal Cross Sections, Westcott Factors, Resonance Integrals, Maxwellian Averaged Cross Sections and Astrophysical Reaction Rates Calculated from the ENDF/B-VII.1, JEFF-3.1.2, JENDL-4.0, ROSFOND-2010, CENDL-3.1 and EAF-2010 Evaluated Data Libraries

    Science.gov (United States)

    Pritychenko, B.; Mughabghab, S. F.

    2012-12-01

    We present calculations of neutron thermal cross sections, Westcott factors, resonance integrals, Maxwellian-averaged cross sections and astrophysical reaction rates for 843 ENDF materials using data from the major evaluated nuclear libraries and European activation file. Extensive analysis of newly-evaluated neutron reaction cross sections, neutron covariances, and improvements in data processing techniques motivated us to calculate nuclear industry and neutron physics quantities, produce s-process Maxwellian-averaged cross sections and astrophysical reaction rates, systematically calculate uncertainties, and provide additional insights on currently available neutron-induced reaction data. Nuclear reaction calculations are discussed and new results are presented. Due to space limitations, the present paper contains only calculated Maxwellian-averaged cross sections and their uncertainties. The complete data sets for all results are published in the Brookhaven National Laboratory report.
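
The Maxwellian-averaged cross sections (MACS) tabulated in such compilations follow the standard definition ⟨σ⟩ = (2/√π)(kT)⁻² ∫ σ(E) E exp(−E/kT) dE. A numerical sketch (the constant cross section is purely illustrative; for constant σ the definition reduces to (2/√π)σ ≈ 1.128σ, a convenient sanity check):

```python
import numpy as np

def trapezoid(y, x):
    """Composite trapezoid rule (kept explicit for NumPy-version independence)."""
    return float(np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2.0)

def macs(sigma, E, kT):
    """Maxwellian-averaged cross section:
    <sigma> = 2/sqrt(pi) * (kT)**-2 * integral of sigma(E) * E * exp(-E/kT) dE."""
    w = E * np.exp(-E / kT)
    return 2.0 / np.sqrt(np.pi) * trapezoid(sigma * w, E) / kT**2

kT = 0.030                                  # kT = 30 keV in MeV, typical s-process value
E = np.linspace(0.0, 30 * kT, 200_001)      # energy grid; the tail beyond 30 kT is negligible
sigma0 = 0.5                                # constant cross section in barns (illustrative)
avg = macs(np.full_like(E, sigma0), E, kT)  # ~ 1.128 * 0.5 barns
```

For real evaluated data, σ(E) would be interpolated from a library file rather than held constant, but the weighting and normalization stay the same.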

  4. Online Self-Organizing Network Control with Time Averaged Weighted Throughput Objective

    Directory of Open Access Journals (Sweden)

    Zhicong Zhang

    2018-01-01

    Full Text Available We study an online multisource multisink queueing network control problem characterized with self-organizing network structure and self-organizing job routing. We decompose the self-organizing queueing network control problem into a series of interrelated Markov Decision Processes and construct a control decision model for them based on the coupled reinforcement learning (RL architecture. To maximize the mean time averaged weighted throughput of the jobs through the network, we propose a reinforcement learning algorithm with time averaged reward to deal with the control decision model and obtain a control policy integrating the jobs routing selection strategy and the jobs sequencing strategy. Computational experiments verify the learning ability and the effectiveness of the proposed reinforcement learning algorithm applied in the investigated self-organizing network control problem.
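
The coupled reinforcement-learning architecture in this record is far beyond a snippet, but its objective — choosing actions that maximize a time-averaged reward — can be illustrated with a much simpler ε-greedy stand-in on a two-choice routing problem (the throughput means, noise level and step counts are all hypothetical):

```python
import numpy as np

rng = np.random.default_rng(0)
means = [1.0, 1.5]        # hypothetical mean weighted throughput of two routing choices
q = np.zeros(2)           # running time-averaged reward observed per action
n = np.zeros(2)

for _ in range(5000):
    # epsilon-greedy: mostly exploit the best time-average seen so far
    a = rng.integers(2) if rng.random() < 0.1 else int(np.argmax(q))
    r = means[a] + 0.2 * rng.standard_normal()
    n[a] += 1
    q[a] += (r - q[a]) / n[a]   # incremental update of the time-averaged reward
```

The incremental update keeps an exact running mean per action, so the learner settles on the routing choice with the higher long-run average throughput — the scalar analogue of the time-averaged reward objective above.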

  5. Memory-type control charts for monitoring the process dispersion

    NARCIS (Netherlands)

    Abbas, N.; Riaz, M.; Does, R.J.M.M.

    2014-01-01

    Control charts have been broadly used for monitoring the process mean and dispersion. Cumulative sum (CUSUM) and exponentially weighted moving average (EWMA) control charts are memory control charts as they utilize the past information in setting up the control structure. This makes CUSUM and
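
The tabular CUSUM referred to in this record accumulates deviations from the target mean beyond a slack value k and raises an alarm once either one-sided sum exceeds a decision interval h. A minimal sketch (k and h are expressed in units of the process standard deviation; the shift size and sample counts are illustrative):

```python
import numpy as np

def cusum_alarms(x, mu0, k=0.5, h=5.0):
    """Two-sided tabular CUSUM; returns the indices at which an alarm is raised."""
    cp = cm = 0.0
    alarms = []
    for i, xi in enumerate(x):
        cp = max(0.0, cp + (xi - mu0) - k)   # accumulates upward drift
        cm = max(0.0, cm + (mu0 - xi) - k)   # accumulates downward drift
        if cp > h or cm > h:
            alarms.append(i)
    return alarms

rng = np.random.default_rng(1)
x = rng.standard_normal(100)
x[40:] += 1.5                 # sustained 1.5-sigma upward shift from sample 40 onward
alarms = cusum_alarms(x, mu0=0.0)
```

Because cp and cm carry memory, a sustained small shift is flagged within a handful of samples — exactly the advantage of memory-type charts over memoryless Shewhart rules.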

  6. New approaches to improve a WCDMA SIR estimator by employing different post-processing stages

    Directory of Open Access Journals (Sweden)

    Amnart Chaichoet

    2008-09-01

    Full Text Available For effective control of transmission power in WCDMA mobile systems, a good estimate of the signal-to-interference ratio (SIR) is needed. Conventionally, an adaptive SIR estimator employs a moving average (MA) filter (Yoon et al., 2002) to counter fading channel distortion. However, the resulting estimate tends to have a high estimation error due to fluctuation in the channel variation. In this paper, an additional post-processing stage is proposed to improve the estimation accuracy by reducing the variation of the estimate. Four variations of post-processing stages, namely 1) a moving average (MA) post-filter, 2) an exponential moving average (EMA) post-filter, 3) an IIR post-filter and 4) a least-mean-squared (LMS) adaptive post-filter, are proposed and their optimal performance in terms of root-mean-square error (RMSE) is then compared by simulation. The results show the best comparable performance when the MA and LMS post-filters are used. However, the MA post-filter requires a lookup table of filter order for optimal performance at different channel conditions, while the LMS post-filter can be used conveniently without a lookup table.
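
Of the four post-processing stages compared, the LMS adaptive post-filter is the easiest to sketch: a single-tap LMS update tracking a slowly varying SIR level. The signal model and all numbers below are illustrative, not from the paper; note that with a constant desired level, a one-tap LMS reduces to an exponential smoother with forgetting factor 1 − μ:

```python
import numpy as np

rng = np.random.default_rng(0)
true_sir = 5.0                                       # hypothetical SIR level (linear scale)
raw = true_sir + 0.8 * rng.standard_normal(2000)     # noisy per-slot SIR estimates

w, mu = 0.0, 0.01         # filter weight and LMS step size
smoothed = []
for r in raw:
    e = r - w             # instantaneous estimation error
    w += mu * e           # LMS weight update
    smoothed.append(w)
smoothed = np.array(smoothed)
```

After the initial transient the smoothed track sits close to the underlying level with far less variance than the raw estimates, which is the accuracy gain the post-filter stage is after.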

  7. THE ROLE OF ENTERPRISE PORTALS IN ENTERPRISE INTEGRATION

    Directory of Open Access Journals (Sweden)

    Gianina RIZESCU

    2006-01-01

    Full Text Available Today’s enterprises are moving business systems to the Internet - to connect people, business processes, and people to business processes, within the enterprise and across enterprise boundaries. The portal brings it all together: business processes, departmental sites, knowledge management resources, enterprise management systems, CRM systems, analytics, email, calendars, external content, transactions, administration, workflow, and more. The goal of this paper is to present the role of the Enterprise Portal in internal and external enterprise integration.

  8. High-throughput machining using a high-average power ultrashort pulse laser and high-speed polygon scanner

    Science.gov (United States)

    Schille, Joerg; Schneider, Lutz; Streek, André; Kloetzer, Sascha; Loeschner, Udo

    2016-09-01

    High-throughput ultrashort pulse laser machining is investigated on various industrial grade metals (aluminum, copper, and stainless steel) and Al2O3 ceramic at unprecedented processing speeds. This is achieved by using a high-average power picosecond laser in conjunction with a unique, in-house developed polygon mirror-based biaxial scanning system. Therefore, different concepts of polygon scanners are engineered and tested to find the best architecture for high-speed and precision laser beam scanning. In order to identify the optimum conditions for efficient processing when using high average laser powers, the depths of cavities made in the samples by varying the processing parameter settings are analyzed and, from the results obtained, the characteristic removal values are specified. For overlapping pulses of optimum fluence, the removal rate is as high as 27.8 mm3/min for aluminum, 21.4 mm3/min for copper, 15.3 mm3/min for stainless steel, and 129.1 mm3/min for Al2O3 when a laser beam of 187 W average power irradiates the surface. On stainless steel, it is demonstrated that the removal rate increases to 23.3 mm3/min when the laser beam is moved very fast. This is thanks to the low pulse overlap achieved with a beam deflection speed of 800 m/s; thus, laser beam shielding can be avoided even when irradiating high-repetition-rate 20-MHz pulses.

  9. Systems integration processes for space nuclear electric propulsion systems

    International Nuclear Information System (INIS)

    Olsen, C.S.; Rice, J.W.; Stanley, M.L.

    1991-01-01

    The various components and subsystems that comprise a nuclear electric propulsion system should be developed and integrated so that each functions ideally and so that each is properly integrated with the other components and subsystems in the optimum way. This paper discusses how processes similar to those used in the development and integration of the subsystems that comprise the Multimegawatt Space Nuclear Power System concepts can be and are being efficiently and effectively utilized for these purposes. The processes discussed include the development of functional and operational requirements at the system and subsystem level; the assessment of individual nuclear power supply and thruster concepts and their associated technologies; the conduct of systems integration efforts including the evaluation of the mission benefits for each system; the identification and resolution of concepts development, technology development, and systems integration feasibility issues; subsystem, system, and technology development and integration; and ground and flight subsystem and integrated system testing

  10. A practical guide to averaging functions

    CERN Document Server

    Beliakov, Gleb; Calvo Sánchez, Tomasa

    2016-01-01

    This book offers an easy-to-use and practice-oriented reference guide to mathematical averages. It presents different ways of aggregating input values given on a numerical scale, and of choosing and/or constructing aggregating functions for specific applications. Building on a previous monograph by Beliakov et al. published by Springer in 2007, it outlines new aggregation methods developed in the interim, with a special focus on the topic of averaging aggregation functions. It examines recent advances in the field, such as aggregation on lattices, penalty-based aggregation and weakly monotone averaging, and extends many of the already existing methods, such as: ordered weighted averaging (OWA), fuzzy integrals and mixture functions. A substantial mathematical background is not called for, as all the relevant mathematical notions are explained here and reported on together with a wealth of graphical illustrations of distinct families of aggregation functions. The authors mainly focus on practical applications ...
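
Ordered weighted averaging (OWA), one of the families extended in the book, applies its weight vector to the inputs after sorting them in descending order, so a single operator family spans the maximum, the minimum and the arithmetic mean. A minimal sketch:

```python
def owa(values, weights):
    """Ordered weighted average: weights are applied to values sorted descending."""
    assert len(values) == len(weights) and abs(sum(weights) - 1.0) < 1e-9
    return sum(w * v for w, v in zip(weights, sorted(values, reverse=True)))

vals = [3.0, 1.0, 2.0]
owa(vals, [1.0, 0.0, 0.0])        # all weight on the largest input -> max
owa(vals, [0.0, 0.0, 1.0])        # all weight on the smallest input -> min
owa(vals, [1/3, 1/3, 1/3])        # uniform weights -> arithmetic mean
```

Choosing intermediate weight vectors yields the "orness" spectrum between min and max that makes OWA useful for tunable aggregation.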

  11. Space Medicine in the Human System Integration Process

    Science.gov (United States)

    Scheuring, Richard A.

    2010-01-01

    This slide presentation reviews the importance of integration of space medicine in the human system of lunar exploration. There is a review of historical precedence in reference to lunar surface operations. The integration process is reviewed in a chart which shows the steps from research to requirements development, requirements integration, design, verification, operations and using the lessons learned, giving more information and items for research. These steps are reviewed in view of specific space medical issues. Some of the testing of the operations are undertaken in an environment that is an analog to the exploration environment. Some of these analog environments are reviewed, and there is some discussion of the benefits of use of an analog environment in testing the processes that are derived.

  12. Value Creation through ICT Integration in Merger & Acquisition Processes

    DEFF Research Database (Denmark)

    Holm Larsen, Michael

    2005-01-01

    As deals are becoming more complex, and as technology, and the people supporting it, are becoming key drivers of merger and acquisition processes, planning of information and communication technologies in the early stages of the integration process is vital to the realization of the benefits of a Merger...... & Acquisition process. This statement is substantiated through a review of literature from academics as well as practitioners, and case exemplifications of the financial service organization, the Nordea Group. Keywords: ICT Integration, Mergers & Acquisitions, Nordea Group....

  13. Moving beyond gender: processes that create relationship equality.

    Science.gov (United States)

    Knudson-Martin, Carmen; Mahoney, Anne Rankin

    2005-04-01

    Equality is related to relationship success, yet few couples achieve it. In this qualitative study, we examine how couples with children in two time cohorts (1982 and 2001) moved toward equality. The analysis identifies three types of couples: Postgender, gender legacy, and traditional. Movement toward equality is facilitated by: (a) Stimulus for change, including awareness of gender, commitment to family and work, and situational pressures; and (b) patterns that promote change, including active negotiation, challenges to gender entitlement, development of new competencies, and mutual attention to relationship and family tasks. Implications for practice are discussed.

  14. Polycation-mediated integrated cell death processes

    DEFF Research Database (Denmark)

    Parhamifar, Ladan; Andersen, Helene; Wu, Linping

    2014-01-01

    standard. PEIs are highly efficient transfectants, but depending on their architecture and size they induce cytotoxicity through different modes of cell death pathways. Here, we briefly review dynamic and integrated cell death processes and pathways, and discuss considerations in cell death assay design...

  15. Temperature distribution in a uniformly moving medium

    International Nuclear Information System (INIS)

    Mitchell, Joseph D; Petrov, Nikola P

    2009-01-01

    We apply several physical ideas to determine the steady temperature distribution in a medium moving with uniform velocity between two infinite parallel plates. We compute it in the coordinate frame moving with the medium by integration over the 'past' to account for the influence of an infinite set of instantaneous point sources of heat in past moments as seen by an observer moving with the medium. The boundary heat flux is simulated by appropriately distributed point heat sources on the inner side of an adiabatically insulating boundary. We make extensive use of the Green functions with an emphasis on their physical meaning. The methodology used in this paper is of great pedagogical value as it offers an opportunity for students to see the connection between powerful mathematical techniques and their physical interpretation in an intuitively clear physical problem. We suggest several problems and a challenging project that can be easily incorporated in undergraduate or graduate courses.
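
The superposition described in this record rests on the Green function of the advection–diffusion equation, which in one dimension is a Gaussian whose center drifts with the medium. A quick numerical check of its two defining properties — unit heat content and a center at x = vt — with illustrative parameter values:

```python
import numpy as np

def trapezoid(y, x):
    """Composite trapezoid rule (kept explicit for NumPy-version independence)."""
    return float(np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2.0)

alpha, v, t = 1.0e-5, 0.02, 50.0        # diffusivity (m^2/s), medium speed (m/s), time (s)
x = np.linspace(-1.0, 3.0, 200_001)

# Green function of dT/dt + v*dT/dx = alpha*d2T/dx2 for a unit point source at x = 0, t = 0
G = np.exp(-(x - v * t) ** 2 / (4 * alpha * t)) / np.sqrt(4 * np.pi * alpha * t)

total = trapezoid(G, x)        # heat content is conserved: integrates to 1
center = trapezoid(x * G, x)   # the pulse is carried along at the medium velocity: v*t
```

Summing such kernels over all past source moments, as the paper does, builds up the steady temperature field seen by an observer riding with the medium.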

  16. MOVES - A tool for Modeling and Verification of Embedded Systems

    DEFF Research Database (Denmark)

    Ellebæk, Jens; Knudsen, Kristian S.; Brekling, Aske Wiid

    2007-01-01

    We demonstrate MOVES, a tool which allows designers of embedded systems to explore possible implementations early in the design process. The demonstration of MOVES will show how designers can explore different designs by changing the mapping of tasks on processing elements, the number and/or spee...... of processing elements, the size of local memories, and the operating systems (scheduling algorithm)....

  17. Thermal Analysis of a Cracked Half-plane under Moving Point Heat Source

    Directory of Open Access Journals (Sweden)

    He Kuanfang

    2017-09-01

    Full Text Available The heat conduction in a half-plane with an insulated crack subjected to a moving point heat source is investigated. The analytical solution and numerical means are combined to analyze the transient temperature distribution of a cracked half-plane under a moving point heat source. The transient temperature distribution of the half-plane structure under the moving point heat source is first obtained by the moving coordinate method; then the heat conduction equation with the thermal boundary condition of an insulated crack face is converted to a singular integral equation by applying Fourier transforms and solved by a numerical method. Numerical examples of the temperature distribution on the cracked half-plane structure under a moving point heat source are presented and discussed in detail.

  18. Optimization of Moving Coil Actuators for Digital Displacement Machines

    DEFF Research Database (Denmark)

    Nørgård, Christian; Bech, Michael Møller; Roemer, Daniel Beck

    2016-01-01

    This paper focuses on deriving an optimal moving coil actuator design, used as the force-producing element in hydraulic on/off valves for Digital Displacement machines. Different moving coil actuator geometry topologies (permanent magnet placement and magnetization direction) are optimized for actuating annular seat valves in a digital displacement machine. The optimization objectives are to minimize the actuator power, the valve flow losses and the height of the actuator. Evaluation of the objective function involves static finite element simulation and simulation of an entire operation...... designs require approximately 20 W on average and may be realized in 20 mm × Ø 22.5 mm (height × diameter) for a 20 kW pressure chamber. The optimization is carried out using the multi-objective Generalized Differential Evolution optimization algorithm GDE3, which successfully handles constrained multi-objective...
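
GDE3, named in this record, is a constrained multi-objective extension of differential evolution; the underlying DE/rand/1/bin scheme it builds on is compact enough to sketch. The single-objective version below minimizes a sphere function; all parameter values are conventional defaults, not those of the paper:

```python
import numpy as np

def differential_evolution(f, bounds, pop_size=20, F=0.7, CR=0.9, gens=200, seed=0):
    """Basic DE/rand/1/bin; GDE3 replaces the greedy scalar selection below
    with Pareto-dominance-based selection over multiple objectives."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds, float).T
    dim = len(bounds)
    pop = rng.uniform(lo, hi, (pop_size, dim))
    fit = np.array([f(x) for x in pop])
    for _ in range(gens):
        for i in range(pop_size):
            idx = [j for j in range(pop_size) if j != i]
            a, b, c = pop[rng.choice(idx, 3, replace=False)]
            mutant = np.clip(a + F * (b - c), lo, hi)   # differential mutation
            cross = rng.random(dim) < CR                # binomial crossover mask
            cross[rng.integers(dim)] = True             # ensure at least one gene
            trial = np.where(cross, mutant, pop[i])
            f_trial = f(trial)
            if f_trial <= fit[i]:                       # greedy one-to-one selection
                pop[i], fit[i] = trial, f_trial
    best = pop[np.argmin(fit)]
    return best, float(fit.min())

best, val = differential_evolution(lambda x: float(np.sum(x**2)), [(-5, 5)] * 3)
```

In the actuator problem, f would be replaced by the finite-element-based evaluation of power, flow loss and height, with dominance rather than a scalar comparison deciding survival.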

  19. Moving Target Detection With Compact Laser Doppler Radar

    Science.gov (United States)

    Sepp, G.; Breining, A.; Eisfeld, W.; Knopp, R.; Lill, E.; Wagner, D.

    1989-12-01

    This paper describes an experimental integrated optronic system for detection and tracking of moving objects. The system is based on a CO2 waveguide laser Doppler radar with homodyne receiver and galvanometer mirror beam scanner. A "hot spot" seeker consisting of a thermal imager with image processor transmits the coordinates of IR-emitting, i.e. potentially powered, objects to the laser radar scanner. The scanner addresses these "hot" locations operating in a large field-of-view (FOV) random access mode. Hot spots exhibiting a Doppler shifted laser signal are indicated in the thermal image by velocity-to-colour encoded markers. After switching to a small FOV scanning mode, the laser Doppler radar is used to track fast moving objects. Laboratory and field experiments with moving objects including rotating discs, automobiles and missiles are described.

  20. A manufacturable process integration approach for graphene devices

    Science.gov (United States)

    Vaziri, Sam; Lupina, Grzegorz; Paussa, Alan; Smith, Anderson D.; Henkel, Christoph; Lippert, Gunther; Dabrowski, Jarek; Mehr, Wolfgang; Östling, Mikael; Lemme, Max C.

    2013-06-01

    In this work, we propose an integration approach for double gate graphene field effect transistors. The approach includes a number of process steps that are key for future integration of graphene in microelectronics: bottom gates with ultra-thin (2 nm) high-quality thermally grown SiO2 dielectrics, shallow trench isolation between devices and atomic layer deposited Al2O3 top gate dielectrics. The complete process flow is demonstrated with fully functional GFET transistors and can be extended to wafer scale processing. We assess, through simulation, the effects of the quantum capacitance and band bending in the silicon substrate on the effective electric fields in the top and bottom gate oxide. The proposed process technology is suitable for other graphene-based devices such as graphene-based hot electron transistors and photodetectors.

  1. Integrated water system simulation by considering hydrological and biogeochemical processes: model development, with parameter sensitivity and autocalibration

    Science.gov (United States)

    Zhang, Y. Y.; Shao, Q. X.; Ye, A. Z.; Xing, H. T.; Xia, J.

    2016-02-01

    Integrated water system modeling is a feasible approach to understanding severe water crises in the world and promoting the implementation of integrated river basin management. In this study, a classic hydrological model (the time variant gain model: TVGM) was extended to an integrated water system model by coupling multiple water-related processes in hydrology, biogeochemistry, water quality, and ecology, and considering the interference of human activities. A parameter analysis tool, which included sensitivity analysis, autocalibration and model performance evaluation, was developed to improve modeling efficiency. To demonstrate the model performances, the Shaying River catchment, which is the largest highly regulated and heavily polluted tributary of the Huai River basin in China, was selected as the case study area. The model performances were evaluated on the key water-related components including runoff, water quality, diffuse pollution load (or nonpoint sources) and crop yield. Results showed that our proposed model simulated most components reasonably well. The simulated daily runoff at most regulated and less-regulated stations matched the observations well. The average correlation coefficient and Nash-Sutcliffe efficiency were 0.85 and 0.70, respectively. Both the simulated low and high flows at most stations were improved when dam regulation was considered. The daily ammonium-nitrogen (NH4-N) concentration was also well captured, with an average correlation coefficient of 0.67. Furthermore, the diffuse source load of NH4-N and the corn yield were reasonably simulated at the administrative region scale. This integrated water system model is expected to improve simulation performance as more model functionalities are added, and to provide a scientific basis for the implementation of integrated river basin management.
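
The two performance scores quoted above can be computed directly from paired observed and simulated series. A minimal sketch in Python, using hypothetical runoff values purely for illustration:

```python
import numpy as np

def nash_sutcliffe(obs, sim):
    """Nash-Sutcliffe efficiency: 1 - SSE / variance of the observations.
    1.0 is a perfect fit; 0.0 means no better than the observed mean."""
    obs = np.asarray(obs, dtype=float)
    sim = np.asarray(sim, dtype=float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def correlation(obs, sim):
    """Pearson correlation coefficient between observed and simulated series."""
    return np.corrcoef(obs, sim)[0, 1]

# Hypothetical daily runoff values (m^3/s), for illustration only.
obs = np.array([12.0, 15.0, 30.0, 22.0, 18.0, 14.0])
sim = np.array([11.0, 16.0, 27.0, 24.0, 17.0, 15.0])

nse = nash_sutcliffe(obs, sim)   # ~0.92
r = correlation(obs, sim)        # ~0.96
```

The Nash-Sutcliffe efficiency compares the squared simulation error against the variance of the observations, which is why it is reported alongside, but is stricter than, the correlation coefficient.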

  2. Effects of stratospheric aerosol surface processes on the LLNL two-dimensional zonally averaged model

    International Nuclear Information System (INIS)

    Connell, P.S.; Kinnison, D.E.; Wuebbles, D.J.; Burley, J.D.; Johnston, H.S.

    1992-01-01

    We have investigated the effects of incorporating representations of heterogeneous chemical processes associated with stratospheric sulfuric acid aerosol into the LLNL two-dimensional, zonally averaged model of the troposphere and stratosphere. Using distributions of aerosol surface area and volume density derived from SAGE II satellite observations, we were primarily interested in changes in partitioning within the Cl- and N- families in the lower stratosphere, compared to a model including only gas phase photochemical reactions

  3. Anatomy as the Backbone of an Integrated First Year Medical Curriculum: Design and Implementation

    Science.gov (United States)

    Klement, Brenda J.; Paulsen, Douglas F.; Wineski, Lawrence E

    2011-01-01

    Morehouse School of Medicine chose to restructure its first year medical curriculum in 2005. The anatomy faculty had prior experience in integrating courses, stemming from the successful integration of individual anatomical sciences courses into a single course called Human Morphology. The integration process was expanded to include the other first year basic science courses (Biochemistry, Physiology, and Neurobiology) as we progressed toward an integrated curriculum. A team, consisting of the course directors, a curriculum coordinator and the Associate Dean for Educational and Faculty Affairs, was assembled to build the new curriculum. For the initial phase, the original course titles were retained but the lecture order was reorganized around the Human Morphology topic sequence. The material from all four courses was organized into four sequential units. Other curricular changes included placing laboratories and lectures more consistently in the daily routine, reducing lecture time from 120 to 90 minute blocks, eliminating unnecessary duplication of content, and increasing the amount of independent study time. Examinations were constructed to include questions from all courses on a single test, reducing the number of examination days in each block from three to one. The entire restructuring process took two years to complete, and the revised curriculum was implemented for the students entering in 2007. The outcomes of the restructured curriculum include a reduction in the number of contact hours by 28%, higher or equivalent subject examination average scores, enhanced student satisfaction, and a first year curriculum team better prepared to move forward with future integration. PMID:21538939

  4. Corrected Integral Shape Averaging Applied to Obstructive Sleep Apnea Detection from the Electrocardiogram

    Directory of Open Access Journals (Sweden)

    C. O'Brien

    2007-01-01

    Full Text Available We present a technique called corrected integral shape averaging (CISA for quantifying shape and shape differences in a set of signals. CISA can be used to account for signal differences which are purely due to affine time warping (jitter and dilation/compression, and hence provide access to intrinsic shape fluctuations. CISA can also be used to define a distance between shapes which has useful mathematical properties; a mean shape signal for a set of signals can be defined, which minimizes the sum of squared shape distances of the set from the mean. The CISA procedure also allows joint estimation of the affine time parameters. Numerical simulations are presented to support the algorithm for obtaining the CISA mean and parameters. Since CISA provides a well-defined shape distance, it can be used in shape clustering applications based on distance measures such as k-means. We present an application in which CISA shape clustering is applied to P-waves extracted from the electrocardiogram of subjects suffering from sleep apnea. The resulting shape clustering distinguishes ECG segments recorded during apnea from those recorded during normal breathing with a sensitivity of 81% and specificity of 84%.
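
The core idea of removing affine time differences before averaging can be illustrated with a simple resampling sketch. This is not the authors' CISA algorithm (which jointly estimates the affine time parameters); it only shows, under the assumption that each signal spans exactly one event, how aligning signals onto a common normalized time axis makes a well-defined mean shape possible. All names here are hypothetical:

```python
import numpy as np

def affine_align(sig, length=100):
    """Resample a signal onto a fixed-length normalized time axis [0, 1],
    removing affine time differences (shift plus dilation/compression),
    under the assumption that each signal spans exactly one event."""
    sig = np.asarray(sig, dtype=float)
    t_old = np.linspace(0.0, 1.0, len(sig))
    t_new = np.linspace(0.0, 1.0, length)
    return np.interp(t_new, t_old, sig)

def mean_shape(signals, length=100):
    """Pointwise mean of the aligned signals: the signal minimizing the
    sum of squared pointwise distances to the aligned set."""
    aligned = np.stack([affine_align(s, length) for s in signals])
    return aligned.mean(axis=0)

# Two pulses identical in shape but sampled over different durations:
t1 = np.linspace(0.0, 1.0, 80)
t2 = np.linspace(0.0, 1.0, 120)
pulse1 = np.exp(-((t1 - 0.5) ** 2) / 0.02)
pulse2 = np.exp(-((t2 - 0.5) ** 2) / 0.02)

m = mean_shape([pulse1, pulse2])   # matches either pulse after alignment
```

Once signals live on a common axis, the Euclidean distance between aligned signals serves as a shape distance, which is what makes distance-based clustering such as k-means applicable.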

  5. Formalism of continual integrals for cascade processes with particle fusion

    International Nuclear Information System (INIS)

    Gedalin, Eh.V.

    1987-01-01

    The formalism of continuous integrals is used to describe cascade processes in which, besides cascade particle reproduction, particle synthesis and coalescence take place. Accounting for cascade particle coalescence means that the development of individual cascade branches can no longer be independent, and the main equations of the cascade process become functional rather than integral. The method of continuous integrals makes it possible to construct, in closed form, generating functionals for the cascade process and to obtain the rules for their calculation using diagrams. Analytical expressions in the form of continuous integrals are obtained for the generating functionals describing cascade development

  6. A comprehensive, holistic people integration process for mergers and acquisitions

    Directory of Open Access Journals (Sweden)

    Rina P. Steynberg

    2011-03-01

    Research purpose: To develop and validate a comprehensive, holistic model for the people integration process during mergers and acquisitions. Motivation for the study: The literature on a comprehensive, holistic people integration process for mergers and acquisitions is sparse and fragmented. Research design, approach and method: A qualitative approach was adopted, consisting of a three-step process which solicited the views of seasoned M&A practitioners; these views were compared against the available literature. Finally, practitioners were asked to critique the final model from a practice perspective. The utility of the final model was assessed against two mergers and acquisitions case studies. Main findings: A comprehensive, holistic people integration process model for mergers and acquisitions was developed and validated. However, this model will only significantly enhance mergers and acquisitions value realisation if it is applied from the appropriate vantage point. Practical/managerial implications: The proposed approach will increase the probability of a successful M&A people-wise and of M&A value realisation. Contribution/value add: Theoretically, the development and validation of an M&A people integration process model; practically, guidelines for successful people integration; organisationally, significantly enhancing the chances of M&A success; and community-wise, the reduction of the negative effects of M&A failure on communities.

  7. Lateral Information Processing by Spiking Neurons: A Theoretical Model of the Neural Correlate of Consciousness

    Directory of Open Access Journals (Sweden)

    Marc Ebner

    2011-01-01

    Full Text Available Cognitive brain functions, for example, sensory perception, motor control and learning, are understood as computation by axonal-dendritic chemical synapses in networks of integrate-and-fire neurons. Cognitive brain functions may occur either consciously or nonconsciously (on “autopilot”). Conscious cognition is marked by gamma synchrony EEG, mediated largely by dendritic-dendritic gap junctions, sideways connections in input/integration layers. Gap-junction-connected neurons define a sub-network within a larger neural network. A theoretical model (the “conscious pilot”) suggests that as gap junctions open and close, a gamma-synchronized sub-network, or zone, moves through the brain as an executive agent, converting nonconscious “auto-pilot” cognition to consciousness, and enhancing computation by coherent processing and collective integration. In this study, we implemented sideways “gap junctions” in a single-layer artificial neural network to perform figure/ground separation. The set of neurons connected through gap junctions forms a reconfigurable resistive grid or sub-network zone. In the model, outgoing spikes are temporally integrated and spatially averaged using the fixed resistive grid set up by neurons of similar function which are connected through gap junctions. This spatial average, essentially a feedback signal from the neuron's output, determines whether particular gap junctions between neurons will open or close. Neurons connected through open gap junctions synchronize their output spikes. We have tested our gap-junction-defined sub-network in a one-layer neural network on artificial retinal inputs using real-world images. Our system is able to perform figure/ground separation where the laterally connected sub-network of neurons represents a perceived object. Even though we only show results for visual stimuli, our approach should generalize to other modalities. The system demonstrates a moving sub-network zone of
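
A toy version of the update rule described here, spatially average each neuron's output over a local neighbourhood and open the gap junctions where that average is high, can be sketched in a few lines. The neighbourhood mean is a crude stand-in for the paper's resistive grid, and the threshold, radius and activity values are all hypothetical:

```python
import numpy as np

def gap_junction_update(activity, threshold=0.5, radius=2):
    """Spatially average each neuron's output over a local neighbourhood
    (a crude stand-in for the resistive grid) and open the gap junctions
    wherever the local average exceeds the threshold."""
    n = len(activity)
    open_mask = np.empty(n, dtype=bool)
    for i in range(n):
        lo, hi = max(0, i - radius), min(n, i + radius + 1)
        open_mask[i] = activity[lo:hi].mean() > threshold
    return open_mask

# A bright "figure" on a dark "ground": open junctions track the figure.
activity = np.array([0.1, 0.1, 0.9, 1.0, 0.95, 0.1, 0.1, 0.1])
mask = gap_junction_update(activity)
```

Because the decision uses the neighbourhood average rather than each neuron's own output, the open-junction zone forms a contiguous patch over the high-activity region, which is the figure/ground behaviour the abstract describes.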

  8. A Multi-Objective Optimization Method to integrate Heat Pumps in Industrial Processes

    OpenAIRE

    Becker, Helen; Spinato, Giulia; Maréchal, François

    2011-01-01

    The aim of process integration methods is to increase the efficiency of industrial processes by using pinch analysis combined with process design methods. In this context, appropriate integrated utilities offer promising opportunities to reduce energy consumption, operating costs and pollutant emissions. Energy integration methods are able to integrate any type of predefined utility, but so far there is no systematic approach to generate potential utility models based on their technology limit...

  9. Comparison of wintertime CO to NOx ratios to MOVES and MOBILE6.2 on-road emissions inventories

    Science.gov (United States)

    Wallace, H. W.; Jobson, B. T.; Erickson, M. H.; McCoskey, J. K.; VanReken, T. M.; Lamb, B. K.; Vaughan, J. K.; Hardy, R. J.; Cole, J. L.; Strachan, S. M.; Zhang, W.

    2012-12-01

    The CO-to-NOx molar emission ratios from the US EPA vehicle emissions models MOVES and MOBILE6.2 were compared to urban wintertime measurements of CO and NOx. Measurements of CO, NOx, and volatile organic compounds were made at a regional air monitoring site in Boise, Idaho for 2 months from December 2008 to January 2009. The site is impacted by roadway emissions from nearby busy urban arterial roads and a highway. The measured CO-to-NOx ratio for morning rush hour periods was 4.2 ± 0.6. The average CO-to-NOx ratio during weekdays between the hours of 08:00 and 18:00, when vehicle miles travelled were highest, was 5.2 ± 0.5. For this time period, MOVES yields an average hourly CO-to-NOx ratio of 9.1, compared to 20.2 for MOBILE6.2. Off-network emissions are a significant fraction of the CO and NOx emissions in MOVES, accounting for 65% of total CO emissions, and significantly increase the CO-to-NOx molar ratio. The observed ratios were closest to the average hourly running-emission ratio for urban roads, which MOVES determined to be 4.3.
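
Because mixing ratios in ppbv are already mole fractions, a molar CO-to-NOx ratio follows directly from the measured concentrations. A minimal sketch with hypothetical hourly values:

```python
# Hypothetical hourly mixing ratios (ppbv). Because ppbv is a mole
# fraction, the CO / NOx quotient is directly a molar ratio.
co_ppbv  = [420.0, 510.0, 380.0, 600.0]
nox_ppbv = [80.0, 100.0, 75.0, 110.0]

hourly_ratios = [co / nox for co, nox in zip(co_ppbv, nox_ppbv)]
mean_ratio = sum(hourly_ratios) / len(hourly_ratios)   # ~5.2
```

Averaging the hourly ratios (rather than dividing summed CO by summed NOx) matches the "average hourly ratio" statistic reported above; the two statistics differ when concentrations vary strongly from hour to hour.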

  10. Integrated wireless fast-scan cyclic voltammetry recording and electrical stimulation for reward-predictive learning in awake, freely moving rats

    Science.gov (United States)

    Li, Yu-Ting; Wickens, Jeffery R.; Huang, Yi-Ling; Pan, Wynn H. T.; Chen, Fu-Yu Beverly; Chen, Jia-Jin Jason

    2013-08-01

    Objective. Fast-scan cyclic voltammetry (FSCV) is commonly used to monitor phasic dopamine release, but it is usually performed with tethered recording and for limited types of animal behavior. A wireless dopamine sensing system is therefore needed for animal behavior experiments. Approach. This study integrates a wireless FSCV system for monitoring the dopamine signal in the ventral striatum with an electrical stimulator that delivers biphasic current to excite dopaminergic neurons in awake, freely moving rats. The measured dopamine signals are unidirectionally transmitted from the wireless FSCV module to the host unit. To reduce electrical artifacts, an optocoupler and a separate power supply are used to isolate the FSCV system from the electrical stimulator, which can be activated by an infrared controller. Main results. In the validation test, the wireless backpack system performs similarly to a conventional wired system and does not significantly affect the locomotor activity of the rat. In the cocaine administration test, the maximum electrically elicited dopamine signals increased to around 230% of the initial value 20 min after the injection of 10 mg kg-1 cocaine. In a classical conditioning test, the dopamine signal in response to a cue increased to around 60 nM over 50 successive trials, while the electrically evoked dopamine concentration decreased from about 90 to 50 nM in the maintenance phase. In contrast, the cue-evoked dopamine concentration progressively decreased and the electrically evoked dopamine was eliminated during the extinction phase. In the histological evaluation, there was little damage to brain tissue after five months of chronic implantation of the stimulating electrode. Significance. We have developed an integrated wireless voltammetry system for measuring dopamine concentration and providing electrical stimulation. The developed wireless FSCV system is proven to be a useful experimental tool for the continuous

  11. Integration Process for the Habitat Demonstration Unit

    Science.gov (United States)

    Gill, Tracy; Merbitz, Jerad; Kennedy, Kriss; Tri, Terry; Howe, A. Scott

    2010-01-01

    The Habitat Demonstration Unit (HDU) is an experimental exploration habitat technology and architecture test platform designed for analog demonstration activities. The HDU project has required a team to integrate a variety of contributions from NASA centers and outside collaborators, and poses a challenge in integrating these disparate efforts into a cohesive architecture. To complete the development of the HDU from conception in June 2009 to rollout for operations in July 2010, a cohesive integration strategy has been developed to integrate the various systems of HDU and the payloads, such as the Geology Lab, that those systems will support. The utilization of interface design standards and uniquely tailored reviews has allowed for an accelerated design process. Scheduled activities include early fit-checks and the utilization of a Habitat avionics test bed prior to equipment installation into HDU. A coordinated effort to utilize modeling and simulation systems has aided in design and integration concept development. Modeling tools have been effective in hardware systems layout, cable routing and length estimation, and human factors analysis. Decision processes on the shell development, including the assembly sequence and the transportation, have been fleshed out early on HDU to maximize the efficiency of both integration and field operations. Incremental test operations leading up to an integrated systems test allow for an orderly systems test program. The HDU will begin its journey as an emulation of a Pressurized Excursion Module (PEM) for 2010 field testing and then may evolve to a Pressurized Core Module (PCM) for 2011 and later field tests, depending on agency architecture decisions. The HDU deployment will vary slightly from current lunar architecture plans to include developmental hardware and software items and additional systems called opportunities for technology demonstration. One of the HDU challenges has been designing to be prepared for the integration of

  12. Ergonomics Integration Improving Production Process Management in Enterprises of Latvia

    OpenAIRE

    Henrijs Kaļķis

    2013-01-01

    Doctoral thesis ERGONOMICS INTEGRATION IMPROVING PRODUCTION PROCESS MANAGEMENT IN ENTERPRISES OF LATVIA ANNOTATION Ergonomics integration in process management has great significance in organisations' growth of productivity. It is a new approach to entrepreneurship and business strategy, where ergonomic aspects and values are taken into account in ensuring effective process management and the profitability of enterprises. This study is aimed at solving the problem of e...

  13. Moving as a gift: relocation in older adulthood.

    Science.gov (United States)

    Perry, Tam E

    2014-12-01

    While discussions of accessibility, mobility and activities of daily living frame relocation studies in older adulthood, this paper explores the emotional motivation of gift giving as a rationale for moving. This ethnographic study investigates the processes of household disbandment and decision-making among older adults in the Midwestern United States relocating in post-Global Financial Crisis contexts. In this study, relationships are created and sustained through the process of moving, linking older adults (n=81), their kin (n=49), and professionals (n=46) in the Midwestern United States. Using Marcel Mauss' The Gift (1925/1990) as a theoretical lens, relocation in older adulthood is conceptualized as a gift in two ways: to one's partner, and to one's kin. Partners may consider the act of moving as a gift given to appease and honor their partner. Kin who were not moving themselves were also recipients of the gift of moving. These gifts enchain others in relationships of reciprocity. However, these gifts, like all gifts, are not without costs or danger, so this paper examines some of the challenges that emerge along with gift-giving. Copyright © 2014 Elsevier Inc. All rights reserved.

  14. Sensory processing patterns predict the integration of information held in visual working memory.

    Science.gov (United States)

    Lowe, Matthew X; Stevenson, Ryan A; Wilson, Kristin E; Ouslis, Natasha E; Barense, Morgan D; Cant, Jonathan S; Ferber, Susanne

    2016-02-01

    Given the limited resources of visual working memory, multiple items may be remembered as an averaged group or ensemble. As a result, local information may be ill-defined, but these ensemble representations provide accurate diagnostics of the natural world by combining gist information with item-level information held in visual working memory. Some neurodevelopmental disorders are characterized by sensory processing profiles that predispose individuals to avoid or seek out sensory stimulation, fundamentally altering their perceptual experience. Here, we report that such processing styles affect the computation of ensemble statistics in the general population. We identified stable adult sensory processing patterns to demonstrate that individuals with low sensory thresholds, who show a greater proclivity to engage in active response strategies to prevent sensory overstimulation, are less likely to integrate mean size information across a set of similar items and are therefore more likely to be biased away from the mean size representation of an ensemble display. We therefore propose that the study of ensemble processing should extend beyond the statistics of the display and should also consider the statistics of the observer. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  15. Average-case analysis of numerical problems

    CERN Document Server

    2000-01-01

    The average-case analysis of numerical problems is the counterpart of the more traditional worst-case approach. The analysis of average error and cost leads to new insight on numerical problems as well as to new algorithms. The book provides a survey of results that were mainly obtained during the last 10 years and also contains new results. The problems under consideration include approximation/optimal recovery and numerical integration of univariate and multivariate functions as well as zero-finding and global optimization. Background material, e.g. on reproducing kernel Hilbert spaces and random fields, is provided.
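
The contrast between worst-case and average-case error can be made concrete by estimating the expected integration error over a distribution of random integrands. The sketch below uses random quartic polynomials and the composite midpoint rule; this is an illustrative stand-in for the random-field settings treated in the book, not an example taken from it:

```python
import numpy as np

rng = np.random.default_rng(42)

def random_integrand():
    """Draw a random quartic polynomial with N(0, 1) coefficients,
    returning the function and its exact integral over [0, 1]."""
    c = rng.standard_normal(5)
    def f(t):
        return sum(ck * t ** k for k, ck in enumerate(c))
    exact = sum(ck / (k + 1) for k, ck in enumerate(c))
    return f, exact

def midpoint_rule(f, n):
    """Composite midpoint rule with n equal subintervals on [0, 1]."""
    t = (np.arange(n) + 0.5) / n
    return f(t).mean()

# Average absolute error over many random draws: the average-case error.
errors = []
for _ in range(200):
    f, exact = random_integrand()
    errors.append(abs(midpoint_rule(f, 8) - exact))
avg_error = sum(errors) / len(errors)
```

The average-case error is typically much smaller than the worst-case bound over the same class, because unfavourable integrands are rare under the sampling distribution; that gap is what motivates the average-case analysis surveyed in the book.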

  16. Integrating Islamist Militants into the Political Process : Palestinian ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Integrating Islamist Militants into the Political Process : Palestinian Hamas. The striking victory of Hamas in the elections of January 2006 raises questions about the integration of Islamists into the Palestinian political system. This project, which is part of a larger program of research on the role of political parties in the Middle ...

  17. Crisis and Regional Integration

    DEFF Research Database (Denmark)

    Dosenrode, Søren

    …Tunisia, Egypt…), where the crisis referred to could be humanitarian, environmental, economic, political… Europe, too, has, according to mass media, been the victim of a crisis, the financial one. Could ‘crisis’ be a beginning of enhanced regional integration? This paper will try to look… at the processes of regional integration in relation to ‘crisis’ in Africa and Europe. First, this paper will look at the concept of ‘crisis’, before it moves on to discuss ‘regional integration’ and the correlation between the two, emphasizing the approaches of neo-functionalism and federal theory… This is the basis for two short case studies of African and European regional integration. The paper tentatively answers the question: will the crises in Africa and Europe respectively further or block regional integration? With a ‘that depends’. But the use of federalism theory and neo-functionalism is seen…

  18. Optimizing the Costs of Solid Sorbent-Based CO2 Capture Process Through Heat Integration

    Energy Technology Data Exchange (ETDEWEB)

    Sjostrom, Sharon [Ada-Es, Inc., Highlands Ranch, CO (United States)

    2016-03-18

    The focus of this project was the ADAsorb™ CO2 Capture Process, a temperature-swing adsorption process that incorporates a three-stage fluidized bed as the adsorber and a single-stage fluidized bed as the regenerator. The ADAsorb™ system was designed, fabricated, and tested under DOE award DEFE0004343. Two amine-based sorbents were evaluated in conjunction with the ADAsorb™ process: “BN”, an ion-exchange resin; and “OJ”, a metal organic framework (MOF) sorbent. Two cross heat exchanger designs were evaluated for use between the adsorber and regenerator: moving bed and fluidized bed. The fluidized bed approach was rejected fairly early in the project because the additional electrical load to power blowers or fans to overcome the pressure drop required for fluidization was estimated to be nominally three times the electrical power that could be generated from the steam saved through the use of the cross heat exchanger. The Energy Research Center at Lehigh University built and utilized a process model of the ADAsorb™ capture process and integrated this model into an existing model of a supercritical PC power plant. The Lehigh models verified that, for the ADAsorb™ system, the largest contributor to parasitic power was lost electrical generation, primarily electric power which the host plant could not generate due to the extraction of low pressure (LP) steam for sorbent heating, followed by power for the CO2 compressor and the blower or fan power required to fluidize the adsorber and regenerator. Sorbent characteristics such as the impacts of moisture uptake, optimized adsorption and regeneration temperatures, and sensitivity to changes in pressure were also included in the modeling study. Results indicate that sorbents which adsorb more than 1-2% moisture by weight are unlikely to be cost competitive unless they have an extremely high CO2 working capacity well exceeding 15% by weight. Modeling also revealed

  19. Historical Data for Average Processing Time Until Hearing Held

    Data.gov (United States)

    Social Security Administration — This dataset provides historical data for average wait time (in days) from the hearing request date until a hearing was held. This dataset includes data from fiscal...

  20. Let’s move our health! The experience of 40 physical activity motivational workshops

    Science.gov (United States)

    Bouté, Catherine; Cailliez, Elisabeth; D Hour, Alain; Goxe, Didier; Gusto, Gaëlle; Copin, Nane; Lantieri, Olivier

    2016-10-19

    Aims: To set up physical activity promotion workshops in health centres to help people with a sedentary lifestyle achieve an adequate level of physical activity. Methods: This health programme, called ‘Bougeons Notre Santé’ (Let’s move our health) has been implemented since 2006 by four health centres in the Pays de la Loire region, in France. This article describes implementation of the programme, its feasibility, how it can be integrated into a global preventive approach and its outcomes on promoting more physical activity. The “Let’s move our health!” programme comprises four group meetings with participants over a period of several months. At these meetings, participants discuss, exchange and monitor their qualitative and quantitative level of physical activity. Realistic and achievable goals are set in consultation with each participant in relation to their personal circumstances and are monitored with a pedometer and a follow-up diary. Support on healthy eating is also provided. This programme is an opportunity to promote health and refer participants to existing local resources. Results: Forty groups, comprising a total of 275 people, have participated in the programme since 2006. After the four meetings, participants had increased their physical activity level by an average of 723 steps per day and 85% reported that they had changed their eating habits. Conclusion: This health promotion programme is feasible and effective: an increase in the physical activity of participants was observed, together with a favourable impact on perceived health, well-being and social links. These workshops are integrated into a network of associations and institutional partners and could be implemented by similar social or health organisations.

  1. An Innovative VHTR Waste Heat Integration with Forward Osmosis Desalination Process

    Energy Technology Data Exchange (ETDEWEB)

    Park, Min Young; Kim, Eung Soo [Seoul National Univ., Seoul (Korea, Republic of)

    2013-10-15

    The integration concept involves coupling the waste heat from the VHTR with the draw solute recovery system of the FO process. By integrating these two novel technologies, advantages such as improved total energy utilization and the production of fresh water from waste heat can be achieved. In order to thermodynamically analyze the integrated system, the FO process and the power conversion system of the VHTR are simulated using the chemical process software UNISIM together with the OLI property package. In this study, a thermodynamic analysis of the integrated VHTR-FO system has been carried out to assess the feasibility of the concept. The FO process, including the draw solute recovery system, is calculated to have a higher GOR than MSF and MED when reasonable FO performance can be assumed. Furthermore, when the FO process is integrated with the VHTR to produce potable water from waste heat, it still shows a GOR comparable to typical GOR values of MSF and MED, and waste heat utilization is significantly higher in FO than in MED and MSF. This results in much higher water production when integrated with the same VHTR plant. Therefore, it can be concluded that the suggested integrated VHTR-FO system is a very promising system concept with a number of advantages over conventional technologies.
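
The gained output ratio (GOR) used above to compare FO with MSF and MED is commonly defined as the latent-heat equivalent of the distillate divided by the thermal energy supplied. A minimal sketch with hypothetical numbers:

```python
# Gained output ratio (GOR): latent-heat equivalent of the distillate
# divided by the thermal energy supplied. All numbers are hypothetical.
H_FG = 2.326e6   # latent heat of vaporization of water, J/kg (approx.)

def gained_output_ratio(distillate_kg, heat_input_j):
    return distillate_kg * H_FG / heat_input_j

# e.g. 10 kg of fresh water produced from 2.9 MJ of supplied waste heat:
gor = gained_output_ratio(10.0, 2.9e6)   # ~8, in the MSF/MED range
```

A higher GOR means more distillate per unit of heat, which is why it is the natural figure of merit when the heat source is fixed, as it is for a given VHTR waste heat stream.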

  2. Collaboration process for integrated social and health care strategy implementation

    Directory of Open Access Journals (Sweden)

    Jukka Korpela

    2012-05-01

    Full Text Available Objective: To present a collaboration process for creating a roadmap for the implementation of a strategy for integrated health and social care. The developed collaboration process includes multiple phases and uses electronic group decision support system (GDSS) technology. Method: A case study done in the South Karelia District of Social and Health Services in Finland during 2010-2011. An expert panel of 13 participants was used in the planning process of the strategy implementation. The participants were interviewed and observed during the case study. Results: As a practical result, a roadmap for integrated health and social care strategy implementation has been developed. The strategic roadmap includes detailed plans of several projects which are needed for successful integration strategy implementation. As an academic result, a collaboration process to create such a roadmap has been developed. Conclusions: The collaboration process and technology seem to suit the planning process well. The participants of the meetings were satisfied with the collaboration process and the GDSS technology. The strategic roadmap was accepted by the participants, which indicates satisfaction with the developed process.

  3. Collaboration process for integrated social and health care strategy implementation.

    Science.gov (United States)

    Korpela, Jukka; Elfvengren, Kalle; Kaarna, Tanja; Tepponen, Merja; Tuominen, Markku

    2012-01-01

    To present a collaboration process for creating a roadmap for the implementation of a strategy for integrated health and social care. The developed collaboration process includes multiple phases and uses electronic group decision support system technology (GDSS). A case study done in the South Karelia District of Social and Health Services in Finland during 2010-2011. An expert panel of 13 participants was used in the planning process of the strategy implementation. The participants were interviewed and observed during the case study. As a practical result, a roadmap for integrated health and social care strategy implementation has been developed. The strategic roadmap includes detailed plans of several projects which are needed for successful integration strategy implementation. As an academic result, a collaboration process to create such a roadmap has been developed. The collaboration process and technology seem to suit the planning process well. The participants of the meetings were satisfied with the collaboration process and the GDSS technology. The strategic roadmap was accepted by the participants, which indicates satisfaction with the developed process.

  4. Testing for Level Shifts in Fractionally Integrated Processes: a State Space Approach

    DEFF Research Database (Denmark)

    Monache, Davide Delle; Grassi, Stefano; Santucci de Magistris, Paolo

Short memory models contaminated by level shifts have similar long-memory features as fractionally integrated processes. This makes it hard to verify whether the true data generating process is a pure fractionally integrated process when employing standard estimation methods based on the autocorrelation function or the periodogram. In this paper, we propose a robust testing procedure, based on an encompassing parametric specification that allows the level shifts to be disentangled from the fractionally integrated component. The estimation is carried out on the basis of a state-space methodology and it leads to a robust estimate of the fractional integration parameter also in the presence of level shifts. Once the memory parameter is correctly estimated, we use the KPSS test for the presence of level shifts. The Monte Carlo simulations show how this approach produces unbiased estimates of the memory parameter...

  5. Integrating personality structure, personality process, and personality development

    NARCIS (Netherlands)

    Baumert, Anna; Schmitt, Manfred; Perugini, Marco; Johnson, Wendy; Blum, Gabriela; Borkenau, Peter; Costantini, Giulio; Denissen, J.J.A.; Fleeson, William; Grafton, Ben; Jayawickreme, Eranda; Kurzius, Elena; MacLeod, Colin; Miller, Lynn C.; Read, Stephen J.; Robinson, Michael D.; Wood, Dustin; Wrzus, Cornelia

    2017-01-01

    In this target article, we argue that personality processes, personality structure, and personality development have to be understood and investigated in integrated ways in order to provide comprehensive responses to the key questions of personality psychology. The psychological processes and

  6. The Kaldnes Moving Bed biofilm technology for treatment of industrial wastewater; Tecnologia Kaldnes Moving Bed biofilm (KMT) para la depuracion de aguas residuales industriales

    Energy Technology Data Exchange (ETDEWEB)

    Rasmussen, V.; Garcia Carrion, M.; Farre Solsona, C.

    2004-07-01

The Kaldnes Moving Bed biofilm technology is a biofilm process which is very suitable for the treatment of industrial wastewaters. Biofilm processes have several acknowledged advantages compared to suspended biomass processes, e.g. resistance to toxicity and load variations. Traditionally, biofilm processes have been known to clog at high loads and hence have not been suited for industrial effluents; however, the Kaldnes Moving Bed biofilm process has overcome this problem. This article describes how the process has been used as pre-treatment ahead of activated sludge at a dairy in the USA, and as the sole treatment at a pharmaceutical plant in Sweden. (Author)

  7. Simplification of Process Integration Studies in Intermediate Size Industries

    DEFF Research Database (Denmark)

    Dalsgård, Henrik; Petersen, P. M.; Qvale, Einar Bjørn

    2002-01-01

It can be argued that the largest potential for energy savings based on process integration is in the intermediate size industry. But this is also the industrial scale in which it is most difficult to make the introduction of energy saving measures economically interesting. This is based on the observation that the systems that eventually result from a process integration project, and that are economically and operationally most interesting, are also quite simple. Four steps are proposed that may be used to simplify the work associated with a given process integration study in an intermediate size industry. Such simplifications may lead to non-optimal economic solutions, which may be right. But the objective of the optimisation is not to reach the best economic solution, but to relatively quickly develop the design of a simple and operationally friendly network without losing too much energy saving potential. (C) 2002...

  8. Process-oriented integration and coordination of healthcare services across organizational boundaries.

    Science.gov (United States)

    Tello-Leal, Edgar; Chiotti, Omar; Villarreal, Pablo David

    2012-12-01

The paper presents a methodology that follows a top-down approach based on a Model-Driven Architecture for integrating and coordinating healthcare services through cross-organizational processes, enabling organizations to provide high-quality healthcare services and continuous process improvements. The methodology provides a modeling language that enables organizations to conceptualize an integration agreement and to identify and design cross-organizational process models. These models are used for the automatic generation of: the private view of the processes each organization should perform to fulfill its role in cross-organizational processes, and Colored Petri Net specifications to implement these processes. A multi-agent system platform provides agents able to interpret Colored Petri Nets to enable the communication between Healthcare Information Systems for executing the cross-organizational processes. Clinical documents are defined using the HL7 Clinical Document Architecture. This methodology guarantees that important requirements for healthcare services integration and coordination are fulfilled: interoperability between heterogeneous Healthcare Information Systems; the ability to cope with changes in cross-organizational processes; the guarantee of alignment between the integrated healthcare service solution defined at the organizational level and the solution defined at the technological level; and the distributed execution of cross-organizational processes while keeping the organizations' autonomy.
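The execution model described in this record rests on interpreting Petri nets: tokens in places represent process state, and a transition fires by consuming tokens from its input places and producing tokens in its output places. As a hedged illustration only, the sketch below uses an ordinary place/transition net rather than the Colored Petri Nets used in the paper, and the place and transition names for a referral fragment are hypothetical:

```python
class PetriNet:
    def __init__(self, marking):
        self.marking = dict(marking)   # place -> token count
        self.transitions = {}          # name -> (input places, output places)

    def add_transition(self, name, inputs, outputs):
        self.transitions[name] = (inputs, outputs)

    def enabled(self, name):
        inputs, _ = self.transitions[name]
        return all(self.marking.get(p, 0) >= 1 for p in inputs)

    def fire(self, name):
        if not self.enabled(name):
            raise ValueError(f"transition {name!r} is not enabled")
        inputs, outputs = self.transitions[name]
        for p in inputs:
            self.marking[p] -= 1
        for p in outputs:
            self.marking[p] = self.marking.get(p, 0) + 1

# Hypothetical cross-organizational fragment: a clinical document moves
# from the referring organization to the receiving one.
net = PetriNet({"doc_at_referrer": 1})
net.add_transition("send_referral", ["doc_at_referrer"], ["doc_in_transit"])
net.add_transition("receive_referral", ["doc_in_transit"], ["doc_at_receiver"])
net.fire("send_referral")
net.fire("receive_referral")
print(net.marking["doc_at_receiver"])  # 1
```

An interpreter of this shape is what lets agents execute a process model directly instead of hard-coding the workflow.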

  9. Integration Processes of Delay Differential Equation Based on Modified Laguerre Functions

    Directory of Open Access Journals (Sweden)

    Yeguo Sun

    2012-01-01

We propose long-time convergent numerical integration processes for delay differential equations. We first construct an integration process based on modified Laguerre functions. Then we establish its global convergence in a certain weighted Sobolev space. The proposed numerical integration processes can also be used for systems of delay differential equations. We also develop a technique for the refinement of modified Laguerre-Radau interpolations. Lastly, numerical results demonstrate the spectral accuracy of the proposed method and coincide well with the analysis.
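The record's method is spectral (modified Laguerre functions); as a much simpler illustration of what "numerically integrating a delay differential equation" means, the sketch below uses fixed-step explicit Euler, reading the delayed value from the stored solution grid. The function names and the test problem are illustrative assumptions, not from the paper:

```python
def integrate_dde(f, history, tau, t_end, h=0.01):
    """Fixed-step explicit Euler for y'(t) = f(y(t), y(t - tau)).

    `history` gives y on [-tau, 0]; delayed values are read from the
    stored solution grid (h is assumed to divide tau evenly).
    """
    steps = int(round(t_end / h))
    lag = int(round(tau / h))
    # Pre-fill the history segment covering t in [-tau, 0].
    ys = [history(-tau + i * h) for i in range(lag + 1)]
    for n in range(steps):
        y_now = ys[-1]
        y_delayed = ys[-1 - lag]          # value at t - tau
        ys.append(y_now + h * f(y_now, y_delayed))
    return ys

# Classic test problem: y'(t) = -y(t - 1), with y(t) = 1 for t <= 0,
# whose solution decays linearly to 0 on [0, 1] and then oscillates.
sol = integrate_dde(lambda y, yd: -yd, lambda t: 1.0, tau=1.0, t_end=5.0, h=0.001)
print(sol[-1])
```

A spectral method replaces this step-by-step march with a global expansion, which is where the long-time convergence result of the paper comes in.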

  10. Materials and Process Design for High-Temperature Carburizing: Integrating Processing and Performance

    Energy Technology Data Exchange (ETDEWEB)

    D. Apelian

    2007-07-23

The objective of the project is to develop an integrated process for fast, high-temperature carburizing. The new process results in an order-of-magnitude reduction in cycle time compared to conventional carburizing and represents significant energy savings, in addition to a corresponding reduction of scrap owing to distortion-free carburizing of steels.

  11. Ra and the average effective strain of surface asperities deformed in metal-working processes

    DEFF Research Database (Denmark)

    Bay, Niels; Wanheim, Tarras; Petersen, A. S

    1975-01-01

Based upon a slip-line analysis of the plastic deformation of surface asperities, a theory is developed determining the Ra-value (c.l.a.) and the average effective strain in the surface layer when deforming asperities in metal-working processes. The ratio between Ra and Ra0, the Ra-values after and before deformation, is a function of the nominal normal pressure and the initial slope γ0 of the surface asperities; the latter parameter does not influence Ra significantly. The average effective strain in the deformed surface layer is a function of the nominal normal pressure and γ0; it is highly dependent on γ0, increasing with increasing γ0. It is shown that the Ra-value and the strain are hardly affected by the normal pressure until interacting deformation of the asperities begins, that is, until the limit of Amonton's law...

  12. Moving In, Moving Through, and Moving Out: The Transitional Experiences of Foster Youth College Students

    Science.gov (United States)

    Gamez, Sara I.

    2017-01-01

    The purpose of this qualitative study was to explore the transitional experiences of foster youth college students. The study explored how foster youth experienced moving into, moving through, and moving out of the college environment and what resources and strategies they used to thrive during their college transitions. In addition, this study…

  13. 49 CFR 1106.4 - The Safety Integration Plan process.

    Science.gov (United States)

    2010-10-01

49 Transportation, Vol. 8, 2010-10-01. The Safety Integration Plan process, Section 1106.4. Other Regulations Relating to Transportation (Continued) SURFACE... CONSIDERATION OF SAFETY INTEGRATION PLANS IN CASES INVOLVING RAILROAD CONSOLIDATIONS, MERGERS, AND ACQUISITIONS...

  14. Integrating conceptualizations of experience into the interaction design process

    DEFF Research Database (Denmark)

    Dalsgaard, Peter

    2010-01-01

    From a design perspective, the increasing awareness of experiential aspects of interactive systems prompts the question of how conceptualizations of experience can inform and potentially be integrated into the interaction design process. This paper presents one approach to integrating theoretical...

  15. Integrated Process Modeling-A Process Validation Life Cycle Companion.

    Science.gov (United States)

    Zahel, Thomas; Hauer, Stefan; Mueller, Eric M; Murphy, Patrick; Abad, Sandra; Vasilieva, Elena; Maurer, Daniel; Brocard, Cécile; Reinisch, Daniela; Sagmeister, Patrick; Herwig, Christoph

    2017-10-17

During the regulatory-requested process validation of pharmaceutical manufacturing processes, companies aim to identify, control, and continuously monitor process variation and its impact on critical quality attributes (CQAs) of the final product. It is difficult to directly connect the impact of single process parameters (PPs) to final product CQAs, especially in biopharmaceutical process development and production, where multiple unit operations are stacked together and interact with each other. Therefore, we present the application of Monte Carlo (MC) simulation using an integrated process model (IPM) that enables estimation of process capability even in early stages of process validation. Once the IPM is established, its capability in risk and criticality assessment is furthermore demonstrated. IPMs can be used to enable holistic production control strategies that take interactions of process parameters of multiple unit operations into account. Moreover, IPMs can be trained with development data, refined with qualification runs, and maintained with routine manufacturing data, which underlines the lifecycle concept. These applications will be shown by means of a process characterization study recently conducted at a world-leading contract manufacturing organization (CMO). The new IPM methodology therefore allows anticipation of out-of-specification (OOS) events, identification of critical process parameters, and risk-based decisions on counteractions that increase process robustness and decrease the likelihood of OOS events.
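The core idea of MC simulation over an integrated process model can be sketched generically: draw process parameters from their variation distributions, propagate them through the stacked unit operations, and count how often the final CQA falls out of specification. The unit-operation models, parameter distributions, and specification limit below are entirely hypothetical and stand in for the proprietary models of the study:

```python
import random

random.seed(42)

def unit_op_a(pp_temp):
    # Hypothetical first unit operation: yield depends on temperature.
    return 0.9 - 0.02 * abs(pp_temp - 37.0)

def unit_op_b(intermediate, pp_ph):
    # Hypothetical second unit operation: purity depends on upstream
    # yield and on pH; interactions between units arise naturally here.
    return intermediate * (1.0 - 0.05 * abs(pp_ph - 7.0))

N = 100_000
oos = 0
for _ in range(N):
    temp = random.gauss(37.0, 1.0)   # process parameter variation
    ph = random.gauss(7.0, 0.2)
    purity = unit_op_b(unit_op_a(temp), ph)
    if purity < 0.85:                # hypothetical specification limit
        oos += 1
print(f"estimated P(OOS) = {oos / N:.4f}")
```

Because the whole chain is simulated at once, the estimate reflects parameter interactions across unit operations rather than each operation's tolerance in isolation.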

  16. Dynamic Complexity Study of Nuclear Reactor and Process Heat Application Integration

    International Nuclear Information System (INIS)

    Taylor, J'Tia Patrice; Shropshire, David E.

    2009-01-01

This paper describes the key obstacles and challenges facing the integration of nuclear reactors with process heat applications as they relate to dynamic issues. The paper also presents capabilities of current modeling and analysis tools available to investigate these issues. A pragmatic approach to analysis is developed with the ultimate objective of improving the viability of nuclear energy as a heat source for process industries. The extension of nuclear energy to process heat industries would improve energy security and aid in the reduction of carbon emissions by reducing demands for foreign-derived fossil fuels. The paper begins with an overview of nuclear reactors and process applications for potential use in an integrated system. Reactors are evaluated against specific characteristics that determine their compatibility with process applications, such as heat outlet temperature. The reactor system categories include light water, heavy water, small to medium, near-term high-temperature, and far-term high-temperature reactors. Low temperature process systems include desalination, district heating, and tar sands and shale oil recovery. High temperature processes that support hydrogen production include steam reforming, steam cracking, hydrogen production by electrolysis, and far-term applications such as the sulfur iodine chemical process and high-temperature electrolysis. A simple static matching between complementary systems is performed; however, to gain a true appreciation for system integration complexity, time-dependent dynamic analysis is required. The paper identifies critical issues arising from the dynamic complexity associated with integration of systems. Operational issues include scheduling conflicts and resource allocation for heat and electricity. Additionally, economic and safety considerations that could impact the successful integration of these systems are considered. Economic issues include the cost differential arising due to an integrated system

  17. Integrated reliability condition monitoring and maintenance of equipment

    CERN Document Server

    Osarenren, John

    2015-01-01

Consider a Viable and Cost-Effective Platform for the Industries of the Future (IOF). Benefit from improved safety, performance, and product deliveries to your customers. Achieve a higher rate of equipment availability, performance, product quality, and reliability. Integrated Reliability: Condition Monitoring and Maintenance of Equipment incorporates reliability engineering and mathematical modeling to help you move toward sustainable development in reliability condition monitoring and maintenance. This text introduces a cost-effective integrated reliability growth monitor, an integrated reliability degradation monitor, technological inheritance coefficient sensors, and a maintenance tool that supplies real-time information for predicting and preventing potential failures of manufacturing processes and equipment. The author highlights five key elements that are essential to any improvement program: improving overall equipment and part effectiveness, quality, and reliability; improving process performance with maint...

  18. Development of Probabilistic Structural Analysis Integrated with Manufacturing Processes

    Science.gov (United States)

    Pai, Shantaram S.; Nagpal, Vinod K.

    2007-01-01

    An effort has been initiated to integrate manufacturing process simulations with probabilistic structural analyses in order to capture the important impacts of manufacturing uncertainties on component stress levels and life. Two physics-based manufacturing process models (one for powdered metal forging and the other for annular deformation resistance welding) have been linked to the NESSUS structural analysis code. This paper describes the methodology developed to perform this integration including several examples. Although this effort is still underway, particularly for full integration of a probabilistic analysis, the progress to date has been encouraging and a software interface that implements the methodology has been developed. The purpose of this paper is to report this preliminary development.

  19. Ontological Analysis of Integrated Process Models: testing hypotheses

    Directory of Open Access Journals (Sweden)

    Michael Rosemann

    2001-11-01

Integrated process modeling is achieving prominence in helping to document and manage business administration and IT processes in organizations. The ARIS framework is a popular example of a framework for integrated process modeling, not least because it underlies the 800 or more reference models embedded in the world's most popular ERP package, SAP R/3. This paper demonstrates the usefulness of the Bunge-Wand-Weber (BWW) representation model for evaluating modeling grammars such as those constituting ARIS. It reports some initial insights gained from pilot testing Green and Rosemann's (2000) evaluative propositions. Even when considering all five views of ARIS, modelers have problems representing business rules, the scope and boundary of systems, and decomposing models. However, even though it is completely ontologically redundant, users still find the function view useful in modeling.

  20. Total site integration of light hydrocarbons separation process

    OpenAIRE

    Ulyev, L.; Vasilyev, M.; Maatouk, A.; Duic, Neven; Khusanovc, Alisher

    2016-01-01

Ukraine is the largest consumer of hydrocarbons per unit of production in Europe (Ukraine policy review, 2006). The most important point is a reduction of energy consumption in the chemical and metallurgical industries as the biggest consumers. This paper deals with the energy savings potential of a light hydrocarbons separation process. The energy consumption of light hydrocarbons separation processes typical of Eastern European countries was analysed. Process Integration (PI) was used to perform a ...

  1. Developing a comprehensive measure of mobility: mobility over varied environments scale (MOVES).

    Science.gov (United States)

    Hirsch, Jana A; Winters, Meghan; Sims-Gould, Joanie; Clarke, Philippa J; Ste-Marie, Nathalie; Ashe, Maureen; McKay, Heather A

    2017-05-25

While recent work emphasizes the multi-dimensionality of mobility, no current measure incorporates multiple domains of mobility. Using existing conceptual frameworks we identified four domains of mobility (physical, cognitive, social, transportation) to create a "Mobility Over Varied Environments Scale" (MOVES). We then assessed expected patterns of MOVES in the Canadian population. An expert panel identified survey items within each MOVES domain from the Canadian Community Health Survey - Healthy Aging Cycle (2008-2009) for 28,555 (weighted population n = 12,805,067) adults (≥45 years). We refined MOVES using principal components analysis and Cronbach's alpha and weighted the items so that each domain contributed 10 points. Expected mobility trends, as assessed by average MOVES, were examined by sociodemographic and health factors, and by province, using Analysis of Variance (ANOVA). MOVES ranges from 0 to 40, where 0 represents individuals who are immobile and 40 those who are fully mobile. Mean MOVES was 29.58 (95% confidence interval (CI) 29.49, 29.67) (10th percentile: 24.17 (95% CI 23.96, 24.38), 90th percentile: 34.70 (CI 34.55, 34.85)). MOVES scores were lower for older, female, and non-white Canadians with worse health and lower socioeconomic status. MOVES was also lower for those who live in less urban areas. MOVES is a holistic measure of mobility for characterizing older adult mobility across populations. Future work should examine individual or neighborhood predictors of MOVES and its relationship to broader health outcomes. MOVES holds utility for research, surveillance, evaluation, and interventions around the broad factors influencing mobility in older adults.
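A composite built this way can be illustrated with a small scoring sketch: each domain subscore is rescaled to a 0-10 range so that the four domains contribute equally, and the rescaled scores are summed to a 0-40 index. The item maxima and the respondent's answers below are invented for illustration and are not the published MOVES items:

```python
def moves_score(domain_scores, domain_maxima):
    """Hypothetical MOVES-style composite: each domain is rescaled to
    0-10 so the four domains contribute equally, then summed to 0-40."""
    assert set(domain_scores) == set(domain_maxima)
    total = 0.0
    for domain, raw in domain_scores.items():
        total += 10.0 * raw / domain_maxima[domain]
    return total

# Illustrative (not the published) raw-score maxima for the four domains.
maxima = {"physical": 8, "cognitive": 6, "social": 5, "transportation": 4}
respondent = {"physical": 6, "cognitive": 6, "social": 4, "transportation": 2}
print(moves_score(respondent, maxima))  # 30.5
```

Rescaling before summing is what prevents a domain with many items from dominating the index.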

  2. Unconscious learning processes: mental integration of verbal and pictorial instructional materials.

    Science.gov (United States)

    Kuldas, Seffetullah; Ismail, Hairul Nizam; Hashim, Shahabuddin; Bakar, Zainudin Abu

    2013-12-01

    This review aims to provide an insight into human learning processes by examining the role of cognitive and emotional unconscious processing in mentally integrating visual and verbal instructional materials. Reviewed literature shows that conscious mental integration does not happen all the time, nor does it necessarily result in optimal learning. Students of all ages and levels of experience cannot always have conscious awareness, control, and the intention to learn or promptly and continually organize perceptual, cognitive, and emotional processes of learning. This review suggests considering the role of unconscious learning processes to enhance the understanding of how students form or activate mental associations between verbal and pictorial information. The understanding would assist in presenting students with spatially-integrated verbal and pictorial instructional materials as a way of facilitating mental integration and improving teaching and learning performance.

  3. Random walk of passive tracers among randomly moving obstacles.

    Science.gov (United States)

    Gori, Matteo; Donato, Irene; Floriani, Elena; Nardecchia, Ilaria; Pettini, Marco

    2016-04-14

    This study is mainly motivated by the need of understanding how the diffusion behavior of a biomolecule (or even of a larger object) is affected by other moving macromolecules, organelles, and so on, inside a living cell, whence the possibility of understanding whether or not a randomly walking biomolecule is also subject to a long-range force field driving it to its target. By means of the Continuous Time Random Walk (CTRW) technique the topic of random walk in random environment is here considered in the case of a passively diffusing particle among randomly moving and interacting obstacles. The relevant physical quantity which is worked out is the diffusion coefficient of the passive tracer which is computed as a function of the average inter-obstacles distance. The results reported here suggest that if a biomolecule, let us call it a test molecule, moves towards its target in the presence of other independently interacting molecules, its motion can be considerably slowed down.
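The qualitative effect reported here can be reproduced with a minimal lattice sketch (not the paper's CTRW machinery): a passive tracer performs a random walk among obstacles that themselves hop randomly, and the diffusion coefficient is estimated from the mean squared displacement. Lattice size, obstacle densities, and step counts below are arbitrary choices:

```python
import random

random.seed(7)
L = 50                       # periodic lattice size
STEPS = 400
MOVES = [(1, 0), (-1, 0), (0, 1), (0, -1)]

def estimate_D(obstacle_density, n_tracers=400):
    sites = [(x, y) for x in range(L) for y in range(L)]
    obstacles = set(random.sample(sites, int(obstacle_density * L * L)))
    tracers = [random.choice(sites) for _ in range(n_tracers)]
    # Track unwrapped displacements so the torus does not bias the MSD.
    disp = [(0, 0)] * n_tracers
    for _ in range(STEPS):
        # Obstacles perform their own random walk (blocking each other).
        for ob in list(obstacles):
            dx, dy = random.choice(MOVES)
            target = ((ob[0] + dx) % L, (ob[1] + dy) % L)
            if target not in obstacles:
                obstacles.remove(ob)
                obstacles.add(target)
        # Passive tracers: a move into an obstacle site is rejected.
        for i, (x, y) in enumerate(tracers):
            dx, dy = random.choice(MOVES)
            target = ((x + dx) % L, (y + dy) % L)
            if target not in obstacles:
                tracers[i] = target
                disp[i] = (disp[i][0] + dx, disp[i][1] + dy)
    msd = sum(dx * dx + dy * dy for dx, dy in disp) / n_tracers
    return msd / (4 * STEPS)   # D = MSD / (2 * dim * t) in 2D

d_free = estimate_D(0.0)      # no obstacles: D should be near 0.25
d_crowded = estimate_D(0.4)   # crowded field slows the tracer down
print(d_free, d_crowded)
```

As in the study, the tracer's diffusion coefficient drops as the inter-obstacle distance shrinks, even though the tracer itself feels no long-range force.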

  4. Research on moving object detection based on frog's eyes

    Science.gov (United States)

    Fu, Hongwei; Li, Dongguang; Zhang, Xinyuan

    2008-12-01

On the basis of the object-information processing mechanism of frog's eyes, this paper discusses a bionic detection technology suitable for processing object information based on frog vision. First, a bionic detection theory imitating frog vision is established: a parallel processing mechanism comprising pick-up and pretreatment of object information, parallel separation of the digital image, parallel processing, and information synthesis. A computer vision detection system is described that detects moving objects of a particular color or shape; experiments indicate that such objects can be detected even against an interfering background. A moving-object detection electronic model imitating biological vision based on frog's eyes is established. In this system the analogue video signal is first digitized, and the digital signal is then separated in parallel by an FPGA. During the parallel processing, the video information can be captured, processed, and displayed at the same time; information fusion is performed through the DSP HPI ports in order to transmit the data processed by the DSP. This system can cover a larger visual field and obtain higher image resolution than ordinary monitoring systems. In summary, simulative experiments on edge detection of moving objects with the Canny algorithm based on this system indicate that it can detect the edges of moving objects in real time; the feasibility of the bionic model was fully demonstrated in the engineering system, laying a solid foundation for future study of detection technology imitating biological vision.

  5. Effect of contact angle hysteresis on moving liquid film integrity.

    Science.gov (United States)

    Simon, F. F.; Hsu, Y. Y.

    1972-01-01

    A study was made of the formation and breakdown of a water film moving over solid surfaces (teflon, lucite, stainless steel, and copper). The flow rate associated with film formation was found to be higher than the flow rate at which film breakdown occurred. The difference in the flow rates for film formation and film breakdown was attributed to contact angle hysteresis. Analysis and experiment, which are in good agreement, indicated that film formation and film breakdown are functions of the advancing and receding angles, respectively.

  6. Moving interfacial crack between two dissimilar soft ferromagnetic materials in uniform magnetic field

    International Nuclear Information System (INIS)

    Zhao, She Xu; Lee, Kang Yong

    2007-01-01

    This paper presents the dynamic magnetoelastic stress intensity factors of a Yoffe-type moving crack at the interface between two dissimilar soft ferromagnetic elastic half-planes. The solids are subjected to a uniform in-plane magnetic field and the crack is opened by internal normal and shear tractions. The problem is considered within the framework of linear magnetoelasticity. By application of the Fourier integral transform, the mixed boundary problem is reduced to a pair of integral equations of the second kind with Cauchy-type singularities. The singular integral equations are solved by means of a Jacobi polynomial expansion method. For a particular case, closed-form solutions are obtained. It is shown that the magnetoelastic stress intensity factors depend on the moving velocity of the crack, the magnetic field and the magnetoelastic properties of the materials

  7. Moving code - Sharing geoprocessing logic on the Web

    Science.gov (United States)

    Müller, Matthias; Bernard, Lars; Kadner, Daniel

    2013-09-01

    Efficient data processing is a long-standing challenge in remote sensing. Effective and efficient algorithms are required for product generation in ground processing systems, event-based or on-demand analysis, environmental monitoring, and data mining. Furthermore, the increasing number of survey missions and the exponentially growing data volume in recent years have created demand for better software reuse as well as an efficient use of scalable processing infrastructures. Solutions that address both demands simultaneously have begun to slowly appear, but they seldom consider the possibility to coordinate development and maintenance efforts across different institutions, community projects, and software vendors. This paper presents a new approach to share, reuse, and possibly standardise geoprocessing logic in the field of remote sensing. Drawing from the principles of service-oriented design and distributed processing, this paper introduces moving-code packages as self-describing software components that contain algorithmic code and machine-readable descriptions of the provided functionality, platform, and infrastructure, as well as basic information about exploitation rights. Furthermore, the paper presents a lean publishing mechanism by which to distribute these packages on the Web and to integrate them in different processing environments ranging from monolithic workstations to elastic computational environments or "clouds". The paper concludes with an outlook toward community repositories for reusable geoprocessing logic and their possible impact on data-driven science in general.

  8. Effect of different machining processes on the tool surface integrity and fatigue life

    Energy Technology Data Exchange (ETDEWEB)

    Cao, Chuan Liang [College of Mechanical and Electrical Engineering, Nanchang University, Nanchang (China); Zhang, Xianglin [School of Materials Science and Engineering, Huazhong University of Science and Technology, Wuhan (China)

    2016-08-15

Ultra-precision grinding, wire-cut electro-discharge machining, and lapping are often used to machine the tools in the fine blanking industry, and the surface integrity produced by these machining processes has attracted great concern in the research field. To study the effect of the machined surface integrity on fine blanking tool life, the surface integrity of different tool materials under different processing conditions and its influence on fatigue life were thoroughly analyzed in the present study. The results show that the surface integrity of different materials was quite different under the same processing condition. For the same tool material, the surface integrity under varying processing conditions was also quite different and deeply influenced the fatigue life.

  9. Determining average path length and average trapping time on generalized dual dendrimer

    Science.gov (United States)

    Li, Ling; Guan, Jihong

    2015-03-01

Dendrimers have a wide range of important applications in various fields. In some cases, during transport or diffusion processes, they transform into their dual structure, named the Husimi cactus. In this paper, we study the structural properties and the trapping problem on a family of generalized dual dendrimers with arbitrary coordination numbers. We first calculate exactly the average path length (APL) of the networks. The APL increases logarithmically with the network size, indicating that the networks exhibit a small-world effect. Then we determine the average trapping time (ATT) of the trapping process in two cases, i.e., when the trap is placed on a central node and when the trap is uniformly distributed over all the nodes of the network. In both cases, we obtain explicit solutions for the ATT and show how they vary with the network size. Besides, we also discuss the influence of the coordination number on trapping efficiency.
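For a finite network given as an adjacency list, the APL can be computed exactly with a breadth-first search from every node. The sketch below builds a small dendrimer-like tree (root of coordination number c, grown generation by generation; this construction is a plausible reading, not necessarily the paper's exact model) and averages shortest-path lengths over ordered node pairs:

```python
from collections import deque

def build_dendrimer(c, generations):
    """Adjacency list of a dendrimer-like tree: a root with c children,
    every later outer node gaining c - 1 children per generation."""
    adj = {0: []}
    frontier = [0]
    for _ in range(generations):
        new_frontier = []
        for node in frontier:
            n_children = c if node == 0 else c - 1
            for _ in range(n_children):
                child = len(adj)
                adj[child] = [node]
                adj[node].append(child)
                new_frontier.append(child)
        frontier = new_frontier
    return adj

def average_path_length(adj):
    n = len(adj)
    total = 0
    for src in adj:                      # BFS from every node
        dist = {src: 0}
        q = deque([src])
        while q:
            u = q.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    q.append(v)
        total += sum(dist.values())
    return total / (n * (n - 1))         # average over ordered pairs

star = build_dendrimer(3, 1)             # root plus 3 leaves
print(average_path_length(star))         # 1.5
```

The all-pairs BFS costs O(n * (n + edges)); the point of the paper's analytical approach is to avoid exactly this brute-force computation for large dendrimers.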

  10. Visual SLAM and Moving-object Detection for a Small-size Humanoid Robot

    Directory of Open Access Journals (Sweden)

    Yin-Tien Wang

    2010-09-01

In the paper, a novel moving object detection (MOD) algorithm is developed and integrated with robot visual Simultaneous Localization and Mapping (vSLAM). The moving object is assumed to be a rigid body and its coordinate system in space is represented by a position vector and a rotation matrix. The MOD algorithm is composed of detection of image features, initialization of image features, and calculation of object coordinates. Experiments are implemented on a small-size humanoid robot and the results show that the performance of the proposed algorithm is efficient for robot visual SLAM and moving object detection.

  11. Process-integrated slag treatment; Prozessintegrierte Schlackebehandlung

    Energy Technology Data Exchange (ETDEWEB)

    Koralewska, R.; Faulstich, M. [Technische Univ., Garching (Germany). Lehrstuhl fuer Wasserguete- und Abfallwirtschaft

    1998-09-01

The present study compares two methods of washing waste incineration slag, one with water only, and one which uses additives during wet deslagging. The presented aggregate offers ideal conditions for process-integrated slag treatment. The paper gives a schematic description of the integrated slag washing process. The washing liquid serves to wash out the readily soluble constituents and remove the fines, while the additives are for immobilising heavy metals in the slag material. The study is based on laboratory and semi-technical trials on the wet chemical treatment of grate slag with addition of carbon dioxide and phosphoric acid.

  12. Manufacturing Process for OLED Integrated Substrate

    Energy Technology Data Exchange (ETDEWEB)

    Hung, Cheng-Hung [Vitro Flat Glass LLC, Cheswick, PA (United States). Glass Technology Center

    2017-03-31

    The main objective of this project was to develop a low-cost integrated substrate for rigid OLED solid-state lighting produced at a manufacturing scale. The integrated substrates could include combinations of soda lime glass substrate, light extraction layer, and an anode layer (i.e., Transparent Conductive Oxide, TCO). Over the 3+ year course of the project, the scope of work was revised to focus on the development of a glass substrates with an internal light extraction (IEL) layer. A manufacturing-scale float glass on-line particle embedding process capable of producing an IEL glass substrate having a thickness of less than 1.7mm and an area larger than 500mm x 400mm was demonstrated. Substrates measuring 470mm x 370mm were used in the OLED manufacturing process for fabricating OLED lighting panels in single pixel devices as large as 120.5mm x 120.5mm. The measured light extraction efficiency (calculated as external quantum efficiency, EQE) for on-line produced IEL samples (>50%) met the project’s initial goal.

  13. Analysis and prediction of daily physical activity level data using autoregressive integrated moving average models

    NARCIS (Netherlands)

    Long, Xi; Pauws, S.C.; Pijl, M.; Lacroix, J.; Goris, A.H.C.; Aarts, R.M.

    2009-01-01

    Results are provided on predicting daily physical activity level (PAL) data from past data of participants of a physical activity lifestyle program aimed at promoting a healthier lifestyle consisting of more physical exercise. The PAL data quantifies the level of a person’s daily physical activity
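The abstract above omits the model details, but the shape of such a forecast is easy to sketch. Below is a minimal, illustrative ARIMA(1,1,0)-style forecaster (difference once, fit an AR(1) coefficient by least squares, then integrate the forecasts back). The function name and the hypothetical PAL series are ours, not the paper's.

```python
import numpy as np

def arima_110_forecast(y, steps=1):
    """Minimal ARIMA(1,1,0)-style forecaster: difference once, fit an
    AR(1) coefficient by least squares, integrate forecasts back."""
    d = np.diff(y)                    # the "I" (integrated) part
    x, z = d[:-1], d[1:]
    phi = float(x @ z / (x @ x))      # least-squares AR(1) coefficient
    forecasts, last, last_d = [], y[-1], d[-1]
    for _ in range(steps):
        last_d = phi * last_d         # AR(1) step on the differences
        last = last + last_d          # undo the differencing
        forecasts.append(float(last))
    return forecasts

# Hypothetical daily physical activity level (PAL) series
pal = np.array([1.40, 1.45, 1.50, 1.52, 1.56, 1.60, 1.63, 1.68])
next_days = arima_110_forecast(pal, steps=3)
```

Because the series trends upward, the fitted coefficient is positive and the forecasts continue above the last observation; a production model would of course select the orders (p, d, q) from the data.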

  14. CMOS and BiCMOS process integration and device characterization

    CERN Document Server

    El-Kareh, Badih

    2009-01-01

    Covers both the theoretical and practical aspects of modern silicon devices and the relationship between their electrical properties and processing conditions. This book also covers silicon devices and integrated process technologies. It discusses modern silicon devices, their characteristics, and interactions with process parameters.

  15. Software features and applications in process design, integration and operation

    Energy Technology Data Exchange (ETDEWEB)

    Dhole, V. [Aspen Tech Limited, Warrington (United Kingdom)

    1999-02-01

Process engineering technologies and tools have evolved rapidly over the last twenty years. Process simulation/modeling, advanced process control, on-line optimisation, production planning and supply chain management are some examples of technologies that have rapidly matured from early commercial prototypes and concepts to established tools with a significant impact on the profitability of the process industry today. Process Synthesis or Process Integration (PI), in comparison, is yet to create its impact and still remains largely in the domain of a few expert users. One key reason why PI has not taken off is that PI tools have not become integral components of standard process engineering environments. Over the last 15 years AspenTech has grown from a small process simulation tool provider to a large multinational company providing a complete suite of process engineering technologies and services covering process design, operation, planning and supply chain management. Throughout this period, AspenTech has acquired experience in rapidly evolving technologies from their early prototype stage to mature products and services. The paper outlines AspenTech's strategy of integrating PI with other, more established process design and operational improvement technologies. The paper illustrates the key elements of AspenTech's strategy via examples of software development initiatives and services projects. The paper also outlines AspenTech's future vision of the role of PI in process engineering. (au)

  16. 9 CFR 316.8 - Unmarked inspected products; moved between official establishments; moved in commerce.

    Science.gov (United States)

    2010-01-01

    ... between official establishments; moved in commerce. 316.8 Section 316.8 Animals and Animal Products FOOD... establishment to another official establishment, for further processing, in a railroad car, truck, or other closed container, if the railroad car, truck, or container is sealed with an official seal of the...

  17. Integrated modelling of near field and engineered barrier system processes

    International Nuclear Information System (INIS)

    Lamont, A.; Gansemer, J.

    1994-01-01

The Yucca Mountain Integrating Model (YMIM) is an integrated model of the Engineered Barrier System that has been developed to assist project managers at LLNL in identifying areas where research emphasis should be placed. The model was designed to be highly modular so that a model of an individual process could be easily modified or replaced without interfering with the models of other processes. The modules modelling container failure and the dissolution of nuclides include particularly detailed, temperature-dependent models of their corresponding processes.

  18. Integrating deep and shallow natural language processing components : representations and hybrid architectures

    OpenAIRE

    Schäfer, Ulrich

    2006-01-01

    We describe basic concepts and software architectures for the integration of shallow and deep (linguistics-based, semantics-oriented) natural language processing (NLP) components. The main goal of this novel, hybrid integration paradigm is improving robustness of deep processing. After an introduction to constraint-based natural language parsing, we give an overview of typical shallow processing tasks. We introduce XML standoff markup as an additional abstraction layer that eases integration ...

  19. Tracking 3D Moving Objects Based on GPS/IMU Navigation Solution, Laser Scanner Point Cloud and GIS Data

    Directory of Open Access Journals (Sweden)

    Siavash Hosseinyalamdary

    2015-07-01

Full Text Available Monitoring vehicular road traffic is a key component of any autonomous driving platform. Detecting moving objects, and tracking them, is crucial to navigating around objects and predicting their locations and trajectories. Laser sensors provide an excellent observation of the area around vehicles, but the point cloud of objects may be noisy, occluded, and prone to different errors. Consequently, object tracking is an open problem, especially for low-quality point clouds. This paper describes a pipeline to integrate various sensor data and prior information, such as a Geospatial Information System (GIS) map, to segment and track moving objects in a scene. We show that even a low-quality GIS map, such as OpenStreetMap (OSM), can improve the tracking accuracy, as well as decrease processing time. A bank of Kalman filters is used to track moving objects in a scene. In addition, we apply a non-holonomic constraint to provide a better orientation estimate of moving objects. The results show that moving objects can be correctly detected, and accurately tracked over time, based on modest-quality Light Detection And Ranging (LiDAR) data, a coarse GIS map, and a fairly accurate Global Positioning System (GPS) and Inertial Measurement Unit (IMU) navigation solution.
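A bank of Kalman filters, as mentioned in the abstract, assigns one filter per tracked object. The building block of such a bank can be sketched as a single 1-D constant-velocity filter; the noise settings and the simulated scenario below are illustrative assumptions, not values from the paper.

```python
import numpy as np

def kalman_cv_track(zs, dt=0.1, q=0.01, r=0.25):
    """Filter noisy 1-D position measurements with a constant-velocity
    Kalman filter; returns the filtered position estimates."""
    F = np.array([[1.0, dt], [0.0, 1.0]])   # state transition (pos, vel)
    H = np.array([[1.0, 0.0]])              # we measure position only
    Q, R = q * np.eye(2), np.array([[r]])   # assumed noise covariances
    x, P = np.array([[zs[0]], [0.0]]), np.eye(2)
    out = []
    for z in zs:
        x, P = F @ x, F @ P @ F.T + Q                  # predict
        K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)   # Kalman gain
        x = x + K @ (np.array([[z]]) - H @ x)          # update
        P = (np.eye(2) - K @ H) @ P
        out.append(float(x[0, 0]))
    return out

# Hypothetical object moving at 1 m/s, observed every 0.1 s with noise
true_pos = [1.0 * 0.1 * k for k in range(50)]
rng = np.random.default_rng(0)
zs = [p + rng.normal(0.0, 0.5) for p in true_pos]
est = kalman_cv_track(zs)
```

A full tracker would run one such filter per detected object, with data association deciding which measurement updates which filter, and a 2-D or 3-D state in place of the scalar position here.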

  20. Consistent and efficient processing of ADCP streamflow measurements

    Science.gov (United States)

    Mueller, David S.; Constantinescu, George; Garcia, Marcelo H.; Hanes, Dan

    2016-01-01

The use of Acoustic Doppler Current Profilers (ADCPs) from a moving boat is a commonly used method for measuring streamflow. Currently, the algorithms used to compute the average depth, compute edge discharge, identify invalid data, and estimate velocity and discharge for invalid data vary among manufacturers. These differences could result in different discharges being computed from identical data. A consistent computational algorithm with automated filtering and quality assessment of ADCP streamflow measurements, independent of the ADCP manufacturer, is being developed in a software program that can process ADCP moving-boat discharge measurements regardless of the ADCP used to collect the data.
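The manufacturer-specific algorithms discussed above are not reproduced here, but each builds on the textbook velocity-area principle for discharge. A minimal sketch with hypothetical subsection widths, depths and mean speeds:

```python
def velocity_area_discharge(widths, depths, speeds):
    """Textbook velocity-area estimate of streamflow, Q = sum(v * d * w),
    summed over cross-section subsections (units: m, m, m/s -> m^3/s)."""
    return sum(w * d * v for w, d, v in zip(widths, depths, speeds))

# Hypothetical three-subsection cross-section
q = velocity_area_discharge([2.0, 2.0, 2.0], [1.0, 2.0, 1.0], [0.5, 1.0, 0.5])
```

With these numbers Q works out to 6 m^3/s; real ADCP processing adds edge-discharge estimation and invalid-data screening on top of this principle, which is exactly where the manufacturers' implementations diverge.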

  1. On controllability of an integrated bioreactor and periodically operated membrane separation process

    DEFF Research Database (Denmark)

    Prado Rubio, Oscar Andres; Jørgensen, Sten Bay; Jonsson, Gunnar Eigil

Recently, an integrated bioreactor and electrically driven membrane separation process (Reverse Electro-Enhanced Dialysis - REED) has been proposed as a method... at a certain lactate concentration level. Hence, productivity can be enhanced by the in situ lactate removal from the cultivation broth during pH controlled fermentation. This can be done by means of ion exchange membranes and electrical potential gradients. The novelty of the integrated process lies... the influence of membrane fouling. Previously, the REED and fermentation processes have been modeled and investigated separately (Prado-Rubio et al., 2011a; Boonmee, 2003). Additionally, a simple quasi-sequential strategy for integrated process design and control structure development has been proposed (Prado...... To understand the controlled operation of the integrated process, it is convenient to use a model based approach supported by experimental evidence.

  2. Heat and work integration: Fundamental insights and applications to carbon dioxide capture processes

    International Nuclear Information System (INIS)

    Fu, Chao; Gundersen, Truls

    2016-01-01

Highlights: • The problem definition of heat and work integration is introduced. • The fundamental insights of heat and work integration are presented. • The design methodology is illustrated with two small test examples. • Applications to three carbon dioxide capture processes are presented. - Abstract: The integration of heat has achieved notable success in the past decades. Pinch Analysis is a well-established methodology for heat integration. Work is an equally important thermodynamic parameter. The enthalpy of a process stream can be changed by the transfer of heat and/or work. Heat and work are actually interchangeable and can thus be integrated. For example, compression processes consume more work at higher temperatures; however, the compression heat may be upgraded and utilized. Expansion processes produce more work at higher temperatures; however, more heat may be required. The classical heat integration problem is thus extended to a new research topic: the integration of both heat and work. The aim of this paper is to present the problem definition, fundamental thermodynamic insights and industrial applications of heat and work integration. The results from studies on the three carbon dioxide capture processes show that significant energy savings can be achieved by proper heat and work integration. In the oxy-combustion process, the work consumption for cryogenic air separation is reduced by 10.1%. In the post-combustion membrane separation process, the specific work consumption for carbon dioxide separation is reduced by 12.9%. In the membrane air separation process, the net work consumption (excluding heat consumption) is reduced by 90%.
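The observation that compression consumes more work at higher inlet temperatures follows from the isentropic work relation for an ideal gas, w = cp*T1*((p2/p1)^((gamma-1)/gamma) - 1), which is linear in the inlet temperature. A quick numerical check, assuming air-like properties (not values from the paper):

```python
def isentropic_compression_work(t_in, pressure_ratio, cp=1005.0, gamma=1.4):
    """Specific compression work (J/kg) for an ideal gas, isentropic case."""
    return cp * t_in * (pressure_ratio ** ((gamma - 1.0) / gamma) - 1.0)

w_cold = isentropic_compression_work(300.0, 5.0)   # compress at 300 K inlet
w_hot = isentropic_compression_work(450.0, 5.0)    # compress at 450 K inlet
```

The hot-inlet compression costs exactly 1.5 times the cold-inlet work here (450/300), which is why deferring compression to lower temperatures, or recovering the compression heat, pays off in heat and work integration.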

  3. Thermochemical production of liquid fuels from biomass: Thermo-economic modeling, process design and process integration analysis

    International Nuclear Information System (INIS)

    Tock, Laurence; Gassner, Martin; Marechal, Francois

    2010-01-01

    A detailed thermo-economic model combining thermodynamics with economic analysis and considering different technological alternatives for the thermochemical production of liquid fuels from lignocellulosic biomass is presented. Energetic and economic models for the production of Fischer-Tropsch fuel (FT), methanol (MeOH) and dimethyl ether (DME) by means of biomass drying with steam or flue gas, directly or indirectly heated fluidized bed or entrained flow gasification, hot or cold gas cleaning, fuel synthesis and upgrading are reviewed and developed. The process is integrated and the optimal utility system is computed. The competitiveness of the different process options is compared systematically with regard to energetic, economic and environmental considerations. At several examples, it is highlighted that process integration is a key element that allows for considerably increasing the performance by optimal utility integration and energy conversion. The performance computations of some exemplary technology scenarios of integrated plants yield overall energy efficiencies of 59.8% (crude FT-fuel), 52.5% (MeOH) and 53.5% (DME), and production costs of 89, 128 and 113 Euro MWh -1 on fuel basis. The applied process design approach allows to evaluate the economic competitiveness compared to fossil fuels, to study the influence of the biomass and electricity price and to project for different plant capacities. Process integration reveals in particular potential energy savings and waste heat valorization. Based on this work, the most promising options for the polygeneration of fuel, power and heat will be determined in a future thermo-economic optimization.

  4. Ethics and integrative medicine: moving beyond the biomedical model.

    Science.gov (United States)

    Guinn, D E

    2001-01-01

For the most part, those who have written on the ethics of complementary and alternative medicine (CAM) and integrative medicine have attempted simply to apply traditional bioethics (in the form of the principles of autonomy, beneficence, nonmaleficence, and justice) to this new area of healthcare. In this article I argue that adopting the practices of CAM requires a new ethical understanding that incorporates the values implicit in those practices. The characteristics of CAM and conventional medicine can be translated into the language of healthcare values in a variety of ways. I suggest that they support 5 core values: integrated humanity, ecological integrity, naturalism, relationalism, and spiritualism. Characteristics of both CAM and conventional medicine are present in each value. What is now thought of as principlism is, in this understanding, simply a subset within these values.

  5. Vertically integrated circuit development at Fermilab for detectors

    International Nuclear Information System (INIS)

    Yarema, R; Deptuch, G; Hoff, J; Khalid, F; Lipton, R; Shenai, A; Trimpl, M; Zimmerman, T

    2013-01-01

Today vertically integrated circuits (a.k.a. 3D integrated circuits) are a popular topic in many trade journals. The many advantages of these circuits have been described, such as higher speed due to shorter trace lengths, the ability to reduce cross talk by placing analog and digital circuits on different levels, higher circuit density without going to smaller feature sizes, lower interconnect capacitance leading to lower power, reduced chip size, and different processing for the various layers to optimize performance. There are some added advantages specifically for MAPS (Monolithic Active Pixel Sensors) in High Energy Physics: four-side buttable pixel arrays, 100% diode fill factor, the ability to move PMOS transistors out of the diode sensing layer, and an increase in channel density. Fermilab began investigating 3D circuits in 2006. Many different bonding processes have been described for fabricating 3D circuits [1]. Fermilab has used three different processes to fabricate several circuits for specific applications in High Energy Physics and X-ray imaging. This paper covers some of the early 3D work at Fermilab and then moves to more recent activities. The major processes we have used are discussed and some of the problems encountered are described. An overview of pertinent 3D circuit designs is presented along with test results thus far.

  6. Moving standard deviation and moving sum of outliers as quality tools for monitoring analytical precision.

    Science.gov (United States)

    Liu, Jiakai; Tan, Chin Hon; Badrick, Tony; Loh, Tze Ping

    2018-02-01

An increase in analytical imprecision (expressed as CVa) can introduce additional variability (i.e. noise) into patient results, which poses a challenge to the optimal management of patients. Relatively little work has been done to address the need for continuous monitoring of analytical imprecision. Through numerical simulations, we describe the use of the moving standard deviation (movSD) and a recently described moving sum of outliers (movSO) of patient results as means for detecting increased analytical imprecision, and compare their performance against internal quality control (QC) and the average of normal (AoN) approaches. The power of detecting an increase in CVa is suboptimal under routine internal QC procedures. The AoN technique almost always had the highest average number of patient results affected before error detection (ANPed), indicating that it had generally the worst capability for detecting an increased CVa. On the other hand, the movSD and movSO approaches were able to detect an increased CVa at significantly lower ANPed, particularly for measurands that displayed a relatively small ratio of biological variation to CVa. CONCLUSION: The movSD and movSO approaches are effective in detecting an increase in CVa for high-risk measurands with small biological variation. Their performance is relatively poor when the biological variation is large. However, the clinical risk of an increase in analytical imprecision is attenuated for these measurands, as the increased imprecision adds only marginally to the total variation and is less likely to impact clinical care. Copyright © 2017 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
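A moving standard deviation of patient results can be computed with a plain rolling window. The sketch below uses our own window size and a simulated shift in imprecision, not the paper's simulation settings:

```python
import numpy as np

def moving_sd(values, window=50):
    """Moving standard deviation over the most recent `window` results."""
    v = np.asarray(values, dtype=float)
    return np.array([v[i - window + 1: i + 1].std(ddof=1)
                     for i in range(window - 1, len(v))])

# Stable analyser followed by a tripling of analytical imprecision
rng = np.random.default_rng(1)
results = np.concatenate([rng.normal(100.0, 1.0, 500),
                          rng.normal(100.0, 3.0, 500)])
ms = moving_sd(results, window=50)
```

The rolling statistic sits near 1 over the stable stretch and climbs toward 3 after the shift; in practice a control limit on this statistic would trigger the alarm.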

  7. Preparing Your Child for a Move

    Science.gov (United States)

    ... it. Kids can need some time and special attention during the transition. Try these tips to make the process less stressful for everyone. Making the Decision to Move Many kids thrive on familiarity and ...

  8. Elements for successful sensor-based process control {Integrated Metrology}

    International Nuclear Information System (INIS)

    Butler, Stephanie Watts

    1998-01-01

    Current productivity needs have stimulated development of alternative metrology, control, and equipment maintenance methods. Specifically, sensor applications provide the opportunity to increase productivity, tighten control, reduce scrap, and improve maintenance schedules and procedures. Past experience indicates a complete integrated solution must be provided for sensor-based control to be used successfully in production. In this paper, Integrated Metrology is proposed as the term for an integrated solution that will result in a successful application of sensors for process control. This paper defines and explores the perceived four elements of successful sensor applications: business needs, integration, components, and form. Based upon analysis of existing successful commercially available controllers, the necessary business factors have been determined to be strong, measurable industry-wide business needs whose solution is profitable and feasible. This paper examines why the key aspect of integration is the decision making process. A detailed discussion is provided of the components of most importance to sensor based control: decision-making methods, the 3R's of sensors, and connectivity. A metric for one of the R's (resolution) is proposed to allow focus on this important aspect of measurement. A form for these integrated components which synergistically partitions various aspects of control at the equipment and MES levels to efficiently achieve desired benefits is recommended

  9. Elements for successful sensor-based process control {Integrated Metrology}

    Science.gov (United States)

    Butler, Stephanie Watts

    1998-11-01

    Current productivity needs have stimulated development of alternative metrology, control, and equipment maintenance methods. Specifically, sensor applications provide the opportunity to increase productivity, tighten control, reduce scrap, and improve maintenance schedules and procedures. Past experience indicates a complete integrated solution must be provided for sensor-based control to be used successfully in production. In this paper, Integrated Metrology is proposed as the term for an integrated solution that will result in a successful application of sensors for process control. This paper defines and explores the perceived four elements of successful sensor applications: business needs, integration, components, and form. Based upon analysis of existing successful commercially available controllers, the necessary business factors have been determined to be strong, measurable industry-wide business needs whose solution is profitable and feasible. This paper examines why the key aspect of integration is the decision making process. A detailed discussion is provided of the components of most importance to sensor based control: decision-making methods, the 3R's of sensors, and connectivity. A metric for one of the R's (resolution) is proposed to allow focus on this important aspect of measurement. A form for these integrated components which synergistically partitions various aspects of control at the equipment and MES levels to efficiently achieve desired benefits is recommended.

  10. Development of Spreadsheet-Based Integrated Transaction Processing Systems and Financial Reporting Systems

    Science.gov (United States)

    Ariana, I. M.; Bagiada, I. M.

    2018-01-01

Development of spreadsheet-based integrated transaction processing systems and financial reporting systems is intended to optimize the capabilities of spreadsheets in accounting data processing. The purposes of this study are: 1) to describe the spreadsheet-based integrated transaction processing systems and financial reporting systems; 2) to test their technical and operational feasibility. This is a research-and-development study. The main steps of the study are: 1) needs analysis (needs assessment); 2) developing the spreadsheet-based integrated transaction processing systems and financial reporting systems; and 3) testing the feasibility of the spreadsheet-based integrated transaction processing systems and financial reporting systems. Technical feasibility includes the ability of the hardware and operating systems to run the accounting application, and its simplicity and ease of use. Operational feasibility includes the ability of users to use the accounting application, the ability of the accounting application to produce information, and the controls built into the accounting application. The instrument used to assess the technical and operational feasibility of the systems is an expert-perception questionnaire. The instrument uses a 4-point Likert scale, from 1 (strongly disagree) to 4 (strongly agree). Data were analyzed using percentage analysis, comparing the observed answers for each item with the ideal answer for that item. The spreadsheet-based integrated transaction processing systems and financial reporting systems integrate sales, purchases, and cash transaction processing systems to produce financial reports (statement of profit or loss and other comprehensive income, statement of changes in equity, statement of financial position, and statement of cash flows) and other reports. The systems are feasible from the technical aspect (87.50%) and the operational aspect (84.17%).
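The percentage analysis described above (observed answers for an item versus the ideal answer for that item) reduces to a one-line computation. A sketch with hypothetical expert ratings on the 4-point Likert scale:

```python
def feasibility_percentage(scores, max_scale=4):
    """Percentage analysis: total observed Likert score vs. the ideal total."""
    return 100.0 * sum(scores) / (max_scale * len(scores))

# Hypothetical expert ratings (1 = strongly disagree ... 4 = strongly agree)
technical = [4, 3, 4, 3]
pct = feasibility_percentage(technical)   # 87.5
```

An all-4 response sheet scores 100%, so the reported 87.50% and 84.17% correspond to average ratings of roughly 3.5 and 3.4 on the 4-point scale.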

  11. Human Capital Theory and Internal Migration: Do Average Outcomes Distort Our View of Migrant Motives?

    Science.gov (United States)

    Korpi, Martin; Clark, William A W

    2017-05-01

    By modelling the distribution of percentage income gains for movers in Sweden, using multinomial logistic regression, this paper shows that those receiving large pecuniary returns from migration are primarily those moving to the larger metropolitan areas and those with higher education, and that there is much more variability in income gains than what is often assumed in models of average gains to migration. This suggests that human capital models of internal migration often overemphasize the job and income motive for moving, and fail to explore where and when human capital motivated migration occurs.
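The multinomial logistic regression used in the study maps mover covariates to probabilities over income-gain categories through a softmax. A minimal sketch of that probability step; the covariates and coefficients below are made up for illustration and are not estimates from the paper:

```python
import numpy as np

def mnl_probs(X, B):
    """Multinomial-logit choice probabilities: row-wise softmax of X @ B."""
    scores = X @ B
    scores = scores - scores.max(axis=1, keepdims=True)  # numerical stability
    e = np.exp(scores)
    return e / e.sum(axis=1, keepdims=True)

# Covariate columns: intercept, higher education (0/1), metro move (0/1)
X = np.array([[1.0, 1.0, 1.0],    # educated mover to a metro area
              [1.0, 0.0, 0.0]])   # reference mover
# Hypothetical coefficients for three outcomes: loss / small gain / large gain
B = np.array([[0.0, 0.2, -0.5],
              [0.0, 0.1, 0.8],
              [0.0, 0.3, 0.9]])
p = mnl_probs(X, B)
```

With these illustrative coefficients, the educated metro mover gets a higher predicted probability of a large income gain than the reference mover, mirroring the qualitative finding of the paper.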

  12. Dynamic Complexity Study of Nuclear Reactor and Process Heat Application Integration

    Energy Technology Data Exchange (ETDEWEB)

J'Tia Patrice Taylor; David E. Shropshire

    2009-09-01

    Abstract This paper describes the key obstacles and challenges facing the integration of nuclear reactors with process heat applications as they relate to dynamic issues. The paper also presents capabilities of current modeling and analysis tools available to investigate these issues. A pragmatic approach to an analysis is developed with the ultimate objective of improving the viability of nuclear energy as a heat source for process industries. The extension of nuclear energy to process heat industries would improve energy security and aid in reduction of carbon emissions by reducing demands for foreign derived fossil fuels. The paper begins with an overview of nuclear reactors and process application for potential use in an integrated system. Reactors are evaluated against specific characteristics that determine their compatibility with process applications such as heat outlet temperature. The reactor system categories include light water, heavy water, small to medium, near term high-temperature, and far term high temperature reactors. Low temperature process systems include desalination, district heating, and tar sands and shale oil recovery. High temperature processes that support hydrogen production include steam reforming, steam cracking, hydrogen production by electrolysis, and far-term applications such as the sulfur iodine chemical process and high-temperature electrolysis. A simple static matching between complementary systems is performed; however, to gain a true appreciation for system integration complexity, time dependent dynamic analysis is required. The paper identifies critical issues arising from dynamic complexity associated with integration of systems. Operational issues include scheduling conflicts and resource allocation for heat and electricity. Additionally, economic and safety considerations that could impact the successful integration of these systems are considered. 
Economic issues include the cost differential arising due to an integrated

  13. Performance analysis of solar energy integrated with natural-gas-to-methanol process

    International Nuclear Information System (INIS)

    Yang, Sheng; Liu, Zhiqiang; Tang, Zhiyong; Wang, Yifan; Chen, Qianqian; Sun, Yuhan

    2017-01-01

Highlights: • Solar energy integrated with the natural-gas-to-methanol process is proposed. • The two processes are modeled and simulated. • Performance analysis of the two processes is conducted. • The proposed process can cut down greenhouse gas emissions. • The proposed process can save natural gas consumption. - Abstract: Methanol is an important platform chemical. Methanol production using natural gas as raw material has a short processing route and well-developed equipment and technology. However, natural gas reserves are not large in China. A solar energy power generation system integrated with the natural-gas-to-methanol (NGTM) process is developed, which may provide a technical route for methanol production in the future. The solar power generation produces electricity for the reforming unit and for system consumption in the solar energy integrated natural-gas-to-methanol system (SGTM). Performance analyses of the conventional natural-gas-to-methanol process and the solar energy integrated process are presented based on simulation results, considering carbon efficiency, production cost, solar energy price, natural gas price, and carbon tax. Results indicate that the solar energy integrated process is able to cut down greenhouse gas (GHG) emissions. In addition, solar energy can replace natural gas as fuel, reducing natural gas consumption by an amount equal to 9.2% of the total natural gas consumed. However, it is not economical at the current technology readiness level compared with the conventional natural-gas-to-methanol process.

  14. Testing and modelling autoregressive conditional heteroskedasticity of streamflow processes

    Directory of Open Access Journals (Sweden)

    W. Wang

    2005-01-01

Full Text Available Conventional streamflow models operate under the assumption of constant variance or season-dependent variances (e.g. ARMA (AutoRegressive Moving Average) models for deseasonalized streamflow series and PARMA (Periodic AutoRegressive Moving Average) models for seasonal streamflow series). However, with the McLeod-Li test and Engle's Lagrange Multiplier test, clear evidence is found for the existence of autoregressive conditional heteroskedasticity (i.e. the ARCH (AutoRegressive Conditional Heteroskedasticity) effect), a nonlinear phenomenon of the variance behaviour, in the residual series from linear models fitted to daily and monthly streamflow processes of the upper Yellow River, China. It is shown that the major cause of the ARCH effect is the seasonal variation in variance of the residual series. However, while the seasonal variation in variance can fully explain the ARCH effect for monthly streamflow, it is only a partial explanation for daily flow. It is also shown that while the periodic autoregressive moving average model is adequate in modelling monthly flows, no model is adequate in modelling daily streamflow processes because none of the conventional time series models takes the seasonal variation in variance, as well as the ARCH effect in the residuals, into account. Therefore, an ARMA-GARCH (Generalized AutoRegressive Conditional Heteroskedasticity) error model is proposed to capture the ARCH effect present in daily streamflow series, as well as to preserve seasonal variation in variance in the residuals. The ARMA-GARCH error model combines an ARMA model for modelling the mean behaviour and a GARCH model for modelling the variance behaviour of the residuals from the ARMA model. Since the GARCH model is not widely used in statistical hydrology, this work can be a useful addition in terms of statistical modelling of daily streamflow processes for the hydrological community.
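Engle's Lagrange Multiplier test cited above regresses the squared residuals on their own lags and uses n times the regression R-squared as the statistic (asymptotically chi-square with as many degrees of freedom as lags). A self-contained sketch, contrasting a simulated ARCH(1) series with white noise (the parameter values are illustrative):

```python
import numpy as np

def engle_lm_stat(resid, lags=5):
    """Engle's LM statistic: n * R^2 from regressing squared residuals
    on their own `lags` lags (asymptotically chi-square, `lags` dof)."""
    e2 = np.asarray(resid, dtype=float) ** 2
    y = e2[lags:]
    X = np.column_stack([np.ones(len(y))] +
                        [e2[lags - j: len(e2) - j] for j in range(1, lags + 1)])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid_ss = ((y - X @ beta) ** 2).sum()
    total_ss = ((y - y.mean()) ** 2).sum()
    return len(y) * (1.0 - resid_ss / total_ss)

# Contrast: a simulated ARCH(1) series versus plain white noise
rng = np.random.default_rng(2)
n = 2000
arch = np.empty(n)
sigma2 = 1.0
for t in range(n):
    arch[t] = np.sqrt(sigma2) * rng.normal()
    sigma2 = 0.5 + 0.5 * arch[t] ** 2     # ARCH(1) variance recursion
iid = rng.normal(size=n)
```

The statistic is large for the ARCH(1) series and small for white noise, which is exactly the diagnostic the authors apply to the residuals of their ARMA and PARMA fits.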

  15. Anticipatory Cyber Security Research: An Ultimate Technique for the First-Move Advantage

    Directory of Open Access Journals (Sweden)

    Bharat S.Rawal

    2016-02-01

Full Text Available Across all industry segments, 96 percent of systems can be breached on average. In the game of cyber security, at every moment a new player (attacker) enters the game with a new skill set. An attacker only needs to be effective once, while defenders of cyberspace have to be successful all of the time. There is a first-mover advantage in such a chasing game, which means that the first move often wins. In this paper, in order to face the security challenges brought about by the attacker's first-move advantage, we analyze the past ten years of cyber-attacks, study immediate attack patterns, and offer tools to predict the next move of the cyber attacker.

  16. A template for integrated community sustainability planning.

    Science.gov (United States)

    Ling, Christopher; Hanna, Kevin; Dale, Ann

    2009-08-01

    This article describes a template for implementing an integrated community sustainability plan. The template emphasizes community engagement and outlines the components of a basic framework for integrating ecological, social and economic dynamics into a community plan. The framework is a series of steps that support a sustainable community development process. While it reflects the Canadian experience, the tools and techniques have applied value for a range of environmental planning contexts around the world. The research is case study based and draws from a diverse range of communities representing many types of infrastructure, demographics and ecological and geographical contexts. A critical path for moving local governments to sustainable community development is the creation and implementation of integrated planning approaches. To be effective and to be implemented, a requisite shift to sustainability requires active community engagement processes, political will, and a commitment to political and administrative accountability, and measurement.

  17. Integrated biological, chemical and physical processes kinetic ...

    African Journals Online (AJOL)

    ... for C and N removal, only gas and liquid phase processes were considered for this integrated model. ... kLA value for the aeration system, which affects the pH in the anoxic and aerobic reactors through CO2 gas exchange. ... Water SA Vol.

  18. Sensorimotor integration in dyslexic children under different sensory stimulations.

    Directory of Open Access Journals (Sweden)

    André R Viana

    Full Text Available Dyslexic children, besides difficulties in mastering literacy, also show poor postural control that might be related to how sensory cues coming from different sensory channels are integrated into proper motor activity. Therefore, the aim of this study was to examine the relationship between sensory information and body sway, with visual and somatosensory information manipulated independently and concurrently, in dyslexic children. Thirty dyslexic and 30 non-dyslexic children were asked to stand as still as possible inside of a moving room, either with eyes closed or open and either lightly touching a moveable surface or not, for 60 seconds under five experimental conditions: (1) no vision and no touch; (2) moving room; (3) moving bar; (4) moving room and stationary touch; and (5) stationary room and moving bar. Body sway magnitude and the relationship between room/bar movement and body sway were examined. Results showed that dyslexic children swayed more than non-dyslexic children in all sensory conditions. Moreover, in those trials with conflicting vision and touch manipulation, dyslexic children swayed less coherently with the stimulus manipulation compared to non-dyslexic children. Finally, dyslexic children showed higher body sway variability and applied higher force while touching the bar compared to non-dyslexic children. Based upon these results, we can suggest that dyslexic children are able to use visual and somatosensory information to control their posture and use the same underlying neural control processes as non-dyslexic children. However, dyslexic children show poorer performance and more variability while relating visual and somatosensory information to motor action, even during a task that does not require active cognitive and motor involvement. Further, in sensory conflict conditions, dyslexic children showed less coherent and more variable body sway. These results suggest that dyslexic children have difficulties in multisensory

  19. Simulation of the process of coating a moving base with uniform films by a method of physical deposition

    International Nuclear Information System (INIS)

    Avilov, A.A.; Grigorevskij, A.V.; Dudnik, S.F.; Kiryukhin, N.M.; Klyukovich, V.A.; Sagalovich, V.V.

    1989-01-01

    A computational algorithm is developed for calculating the thickness of films deposited by physical methods onto a backing of any shape moving along a given trajectory. The suggested algorithm makes it possible to carry out direct simulation of the film deposition process and to optimize the source arrangement for obtaining films with a required degree of uniformity. The condensate distribution on a rotating sphere was calculated and is presented here. Satisfactory agreement of the calculated values with experimental data on metal films obtained by electron-arc spraying was established

  20. Efficient Continuously Moving Top-K Spatial Keyword Query Processing

    DEFF Research Database (Denmark)

    Wu, Dinming; Yiu, Man Lung; Jensen, Christian Søndergaard

    2011-01-01

    ... keyword data. State-of-the-art solutions for moving queries employ safe zones that guarantee the validity of reported results as long as the user remains within a zone. However, existing safe zone methods focus solely on spatial locations and ignore text relevancy. We propose two algorithms for computing safe zones that guarantee correct results at any time and that aim to optimize the computation on the server as well as the communication between the server and the client. We exploit tight and conservative approximations of safe zones and aggressive computational space pruning. Empirical studies...
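The client-side mechanics of a safe zone can be sketched in a few lines: the client caches the last reported result and contacts the server only after its position leaves the zone. The circular zone, the `server` callback, and all names below are hypothetical simplifications for illustration, not the paper's actual safe-zone construction.

```python
import math

def euclid(p, q):
    """Euclidean distance between two 2-D points."""
    return math.hypot(p[0] - q[0], p[1] - q[1])

class SafeZoneClient:
    """Caches the last top-k result; re-queries the server only when the
    user's position leaves the current safe zone (simplified to a circle)."""

    def __init__(self, server, position):
        self.server = server
        self.server_calls = 0
        self._refresh(position)

    def _refresh(self, position):
        self.server_calls += 1
        self.center, self.radius, self.result = self.server(position)

    def query(self, position):
        if euclid(position, self.center) > self.radius:
            self._refresh(position)  # left the zone: fetch result + new zone
        return self.result           # otherwise the cached result is valid

# Dummy server: returns a fixed top-k result and a unit-radius circular zone.
def server(position):
    return position, 1.0, ["cafe A", "cafe B"]

client = SafeZoneClient(server, (0.0, 0.0))
client.query((0.3, 0.4))   # inside the zone: no communication
client.query((2.0, 0.0))   # outside: triggers one server round trip
```

The communication saving comes from the first branch: every position update inside the zone is answered locally.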

  1. Statistics on exponential averaging of periodograms

    Energy Technology Data Exchange (ETDEWEB)

    Peeters, T.T.J.M. [Netherlands Energy Research Foundation (ECN), Petten (Netherlands); Ciftcioglu, Oe. [Istanbul Technical Univ. (Turkey). Dept. of Electrical Engineering

    1994-11-01

    The algorithm of exponential averaging applied to subsequent periodograms of a stochastic process is used to estimate the power spectral density (PSD). For an independent process, assuming the periodogram estimates to be distributed according to a χ² distribution with 2 degrees of freedom, the probability density function (PDF) of the PSD estimate is derived. A closed expression is obtained for the moments of the distribution. Surprisingly, the proof of this expression features some new insights into the partitions and Euler's infinite product. For large values of the time constant of the averaging process, examination of the cumulant generating function shows that the PDF approximates the Gaussian distribution. Although restrictions for the statistics are seemingly tight, simulation of a real process indicates a wider applicability of the theory. (orig.).

  2. Statistics on exponential averaging of periodograms

    International Nuclear Information System (INIS)

    Peeters, T.T.J.M.; Ciftcioglu, Oe.

    1994-11-01

    The algorithm of exponential averaging applied to subsequent periodograms of a stochastic process is used to estimate the power spectral density (PSD). For an independent process, assuming the periodogram estimates to be distributed according to a χ² distribution with 2 degrees of freedom, the probability density function (PDF) of the PSD estimate is derived. A closed expression is obtained for the moments of the distribution. Surprisingly, the proof of this expression features some new insights into the partitions and Euler's infinite product. For large values of the time constant of the averaging process, examination of the cumulant generating function shows that the PDF approximates the Gaussian distribution. Although restrictions for the statistics are seemingly tight, simulation of a real process indicates a wider applicability of the theory. (orig.)
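The exponential averaging of periodograms described in these two records can be sketched directly. A minimal pure-Python version, assuming the recursion S_m = (1 − α)·S_{m−1} + α·P_m over successive raw periodograms (the α and segment sizes below are illustrative):

```python
import cmath
import math
import random

def periodogram(x):
    """Raw periodogram |DFT(x)|^2 / N of one data segment (plain DFT)."""
    N = len(x)
    return [abs(sum(x[n] * cmath.exp(-2j * math.pi * k * n / N)
                    for n in range(N))) ** 2 / N
            for k in range(N // 2 + 1)]

def exponential_average_psd(segments, alpha):
    """Exponentially average successive periodograms:
    S_m = (1 - alpha) * S_{m-1} + alpha * P_m."""
    S = None
    for seg in segments:
        P = periodogram(seg)
        S = P if S is None else [(1 - alpha) * s + alpha * p
                                 for s, p in zip(S, P)]
    return S

# Unit-variance white noise: the averaged PSD should be roughly flat at 1.
random.seed(0)
segments = [[random.gauss(0.0, 1.0) for _ in range(32)] for _ in range(200)]
psd = exponential_average_psd(segments, alpha=0.1)
```

With this normalization the expected value per bin equals the noise variance; the effective number of averaged periodograms is about (2 − α)/α, which sets the variance reduction the record's χ² analysis quantifies.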

  3. Development of integrated control system for smart factory in the injection molding process

    Science.gov (United States)

    Chung, M. J.; Kim, C. Y.

    2018-03-01

    In this study, we propose an integrated control system for automation of the injection molding process, as required for the construction of a smart factory. The injection molding process consists of heating, tool close, injection, cooling, tool open, and take-out. A take-out robot controller, an image processing module, and a process data acquisition interface module were developed and assembled into the integrated control system. By adopting the integrated control system, the injection molding process can be simplified and the cost of constructing a smart factory can be reduced.

  4. The Acceleration of Charged Particles at a Spherical Shock Moving through an Irregular Magnetic Field

    Energy Technology Data Exchange (ETDEWEB)

    Giacalone, J. [Department of Planetary Sciences, University of Arizona, Tucson, AZ (United States)

    2017-10-20

    We investigate the physics of charged-particle acceleration at spherical shocks moving into a uniform plasma containing a turbulent magnetic field with a uniform mean. This has applications to particle acceleration at astrophysical shocks, most notably, to supernovae blast waves. We numerically integrate the equations of motion of a large number of test protons moving under the influence of electric and magnetic fields determined from a kinematically defined plasma flow associated with a radially propagating blast wave. Distribution functions are determined from the positions and velocities of the protons. The unshocked plasma contains a magnetic field with a uniform mean and an irregular component having a Kolmogorov-like power spectrum. The field inside the blast wave is determined from Maxwell’s equations. The angle between the average magnetic field and unit normal to the shock varies with position along its surface. It is quasi-perpendicular to the unit normal near the sphere’s equator, and quasi-parallel to it near the poles. We find that the highest intensities of particles, accelerated by the shock, are at the poles of the blast wave. The particles “collect” at the poles as they approximately adhere to magnetic field lines that move poleward from their initial encounter with the shock at the equator, as the shock expands. The field lines at the poles have been connected to the shock the longest. We also find that the highest-energy protons are initially accelerated near the equator or near the quasi-perpendicular portion of the shock, where the acceleration is more rapid.

  5. Monitoring of laser material processing using machine integrated low-coherence interferometry

    Science.gov (United States)

    Kunze, Rouwen; König, Niels; Schmitt, Robert

    2017-06-01

    Laser material processing has become an indispensable tool in modern production. With the availability of high-power pico- and femtosecond laser sources, laser material processing is advancing into applications which demand the highest accuracies, such as laser micro milling or laser drilling. In order to enable narrow tolerance windows, closed-loop monitoring of the geometrical properties of the processed work piece is essential for achieving a robust manufacturing process. Low coherence interferometry (LCI) is a high-precision measuring principle well known from surface metrology. In recent years, we demonstrated successful integrations of LCI into several different laser material processing methods. Within this paper, we give an overview of the different machine integration strategies, which always aim at a complete and ideally telecentric integration of the measurement device into the existing beam path of the processing laser. Thus, highly accurate depth measurements within machine coordinates and subsequent process control and quality assurance are possible. First products using this principle have already found their way to the market, which underlines the potential of this technology for the monitoring of laser material processing.

  6. Trends And Economic Assessment Of Integration Processes At The Metal Market

    OpenAIRE

    Olga Aleksandrovna Romanova; Eduard Vyacheslavovich Makarov

    2015-01-01

    The article discusses the integration process from the perspective of three dimensions that characterize it: the increase in the number and the appearance of new relationships; the strength, character and stability of emerging communications; and the dynamics and corresponding form of the process. In the article, trends in the development of integration processes in metallurgy are identified, and the identification of five stages of development in Russian metal trading is justified. We propose a s...

  7. Switching moving boundary models for two-phase flow evaporators and condensers

    Science.gov (United States)

    Bonilla, Javier; Dormido, Sebastián; Cellier, François E.

    2015-03-01

    The moving boundary method is an appealing approach for the design, testing and validation of advanced control schemes for evaporators and condensers. When it comes to advanced control strategies, not only accurate but fast dynamic models are required. Moving boundary models are fast low-order dynamic models, and they can describe the dynamic behavior with high accuracy. This paper presents a mathematical formulation based on physical principles for two-phase flow moving boundary evaporator and condenser models which support dynamic switching between all possible flow configurations. The models were implemented in a library using the equation-based object-oriented Modelica language. Several integrity tests in steady-state and transient predictions together with stability tests verified the models. Experimental data from a direct steam generation parabolic-trough solar thermal power plant is used to validate and compare the developed moving boundary models against finite volume models.

  8. Methodology for optimization of process integration schemes in a biorefinery under uncertainty

    International Nuclear Information System (INIS)

    González-Cortés, Meilyn; Martínez-Martínez, Yenisleidys; Albernas-Carvajal, Yailet; Pedraza-Garciga, Julio; Morales-Zamora, Marlen (Departamento de Ingeniería Química, Facultad de Química y Farmacia, Universidad Central Marta Abreu de las Villas, Cuba)

    2017-01-01

    Uncertainty has a great impact on investment decisions, the operability of plants, and the feasibility of integration opportunities in chemical processes. This paper presents the steps for optimizing process investment in process integration under conditions of uncertainty. The potential of sugar cane biomass for integration with several plants in a biorefinery scheme for obtaining chemical products, thermal energy, and electric energy is shown. Among the factories with potential for this integration are pulp and paper factories, sugar factories, and other derivative processes. These factories have common resources and also a variety of products that can be exchanged between them, so certain products generated in one of them can be raw material in another plant. The methodology developed guides the obtaining of feasible investment projects under uncertainty. The objective function considered was the maximization of net present value in the different scenarios generated from the integration scheme. (author)

  9. Computer-integrated electric-arc melting process control system

    OpenAIRE

    Дёмин, Дмитрий Александрович

    2014-01-01

    Developing common principles for equipping melting process automation systems with hardware and creating on their basis rational variants of computer-integrated electric-arc melting control systems is a relevant task, since it allows a comprehensive approach to the issue of modernizing the melting sections of workshops. This approach makes it possible to form the computer-integrated electric-arc furnace control system as part of a queuing system “electric-arc furnace - foundry conveyor” and to consider, when taking ...

  10. Recent Development of an Earth Science App - FieldMove Clino

    Science.gov (United States)

    Vaughan, Alan; Collins, Nathan; Krus, Mike; Rourke, Peter

    2014-05-01

    As geological modelling and analysis move into 3D digital space, it becomes increasingly important to be able to rapidly integrate new data with existing databases, without the potential degradation caused by repeated manual transcription of numeric, graphical and meta-data. Digital field mapping offers significant benefits when compared with traditional paper mapping techniques, in that it can directly and interactively feed, and be guided by, downstream geological modelling and analysis. One of the most important pieces of equipment used by field geologists is the compass clinometer. Midland Valley's development team have recently released their highly anticipated FieldMove Clino app. FieldMove Clino is a digital compass-clinometer for data capture on a smartphone. The app allows the user to use their phone as a traditional hand-held bearing compass, as well as a digital compass-clinometer for rapidly measuring and capturing the georeferenced location and orientation of planar and linear features in the field. The user can also capture and store digital photographs and text notes. FieldMove Clino supports online Google Maps as well as offline maps, so that the user can import their own georeferenced basemaps. Data can be exported as comma-separated values (.csv) or Move™ (.mve) files and then imported directly into FieldMove™, Move™ or other applications. Midland Valley is currently pioneering tablet-based mapping and, along with its industrial and academic partners, will be using the application in field-based projects throughout this year and will be integrating feedback into further developments of this technology.

  11. Economic Benefit from Progressive Integration of Scheduling and Control for Continuous Chemical Processes

    Directory of Open Access Journals (Sweden)

    Logan D. R. Beal

    2017-12-01

    Full Text Available Performance of integrated production scheduling and advanced process control with disturbances is summarized and reviewed with four progressive stages of scheduling and control integration and responsiveness to disturbances: open-loop segregated scheduling and control, closed-loop segregated scheduling and control, open-loop scheduling with consideration of process dynamics, and closed-loop integrated scheduling and control responsive to process disturbances and market fluctuations. Progressive economic benefit from dynamic rescheduling and integrating scheduling and control is shown on a continuously stirred tank reactor (CSTR) benchmark application in closed-loop simulations over 24 h. A fixed-horizon integrated scheduling and control formulation for multi-product, continuous chemical processes is utilized, in which nonlinear model predictive control (NMPC) and continuous-time scheduling are combined.

  12. Integrating ergonomics into the product development process

    DEFF Research Database (Denmark)

    Broberg, Ole

    1997-01-01

    A cross-sectional case study was performed in a large company producing electro-mechanical products for industrial application. The purpose was to elucidate conditions and strategies for integrating ergonomics into the product development process, thereby preventing ergonomic problems at the time of manufacture of new products. In reality the product development process is not a rational problem solving process and does not proceed in a sequential manner as described in engineering models. Instead it is a complex organizational process involving uncertainties, iterative elements and negotiation between ... and production engineers regarding information sources in problem solving, communication patterns, perception of ergonomics, motivation and requests for support tools and methods. These differences and the social and organizational contexts of the development process must be taken into account when considering...

  13. Business process integration between European manufacturers and transport and logistics service providers

    DEFF Research Database (Denmark)

    Mortensen, Ole; Lemoine, W

    2005-01-01

    The goal of the Supply Chain Management process is to create value for customers, stakeholders and all supply chain members, through the integration of disparate processes like manufacturing flow management, customer service and order fulfillment. However, many firms fail on the path to achieving a total integration. This study illustrates, from an empirical point of view, the problems associated with SC integration among European firms operating in global/international markets. The focus is on the relationship between two echelons in the supply chain: manufacturers and their transport and logistics service providers (TLSPs). The paper examines (1) the characteristics of the collaborative partnerships established between manufacturers and their TLSPs; (2) to what extent manufacturers and their TLSPs have integrated SC business processes; (3) the IT used to support the SC cooperation and integration...

  14. Sustainable Chemical Process Development through an Integrated Framework

    DEFF Research Database (Denmark)

    Papadakis, Emmanouil; Kumar Tula, Anjan; Anantpinijwatna, Amata

    2016-01-01

    This paper describes the development and the application of a general integrated framework based on systematic model-based methods and computer-aided tools, with the objective of achieving more sustainable process designs and improving process understanding. The developed framework can be applied ... The case studies involve multiphase reaction systems for the synthesis of active pharmaceutical ingredients.

  15. Supporting BPMN choreography with system integration artefacts for enterprise process collaboration

    Science.gov (United States)

    Nie, Hongchao; Lu, Xudong; Duan, Huilong

    2014-07-01

    Business Process Model and Notation (BPMN) choreography modelling depicts externally visible message exchanges between collaborating processes of enterprise information systems. Implementation of choreography relies on designing system integration solutions to realise message exchanges between independently developed systems. Enterprise integration patterns (EIPs) are widely accepted artefacts to design integration solutions. If the choreography model represents coordination requirements between processes with behaviour mismatches, the integration designer needs to analyse the routing requirements and address these requirements by manually designing EIP message routers. As collaboration scales and complexity increases, manual design becomes inefficient. Thus, the research problem of this paper is to explore a method to automatically identify routing requirements from BPMN choreography model and to accordingly design routing in the integration solution. To achieve this goal, recurring behaviour mismatch scenarios are analysed as patterns, and corresponding solutions are proposed as EIP routers. Using this method, a choreography model can be analysed by computer to identify occurrences of mismatch patterns, leading to corresponding router selection. A case study demonstrates that the proposed method enables computer-assisted integration design to implement choreography. A further experiment reveals that the method is effective to improve the design quality and reduce time cost.

  16. Adding rectifying/stripping section type heat integration to a pressure-swing distillation (PSD) process

    International Nuclear Information System (INIS)

    Huang Kejin; Shan Lan; Zhu Qunxiong; Qian Jixin

    2008-01-01

    This paper studies the economical effect of considering rectifying/stripping section type heat integration in a pressure-swing distillation (PSD) process separating a binary homogeneous pressure-sensitive azeotrope. The schemes for arranging heat integration between the rectifying section and the stripping section of the high- and low-pressure distillation columns, respectively, are derived and an effective procedure is devised for the conceptual process design of the heat-integrated PSD processes. In terms of the separation of a binary azeotropic mixture of acetonitrile and water, intensive comparisons are made between the conventional and heat-integrated PSD processes. It is demonstrated that breaking a pressure-sensitive azeotropic mixture can be made more economical than the current practice with the conventional PSD process. For boosting further the thermodynamic efficiency of a PSD process, it is strongly suggested to consider simultaneously the condenser/reboiler type heat integration with the rectifying/stripping section type heat integration in process synthesis and design

  17. Adding rectifying/stripping section type heat integration to a pressure-swing distillation (PSD) process

    Energy Technology Data Exchange (ETDEWEB)

    Huang Kejin [School of Information Science and Technology, Beijing University of Chemical Technology, Chaoyang-qu, Beijing-shi, Beijing 100029 (China)], E-mail: huangkj@mail.buct.edu.cn; Shan Lan; Zhu Qunxiong [School of Information Science and Technology, Beijing University of Chemical Technology, Chaoyang-qu, Beijing-shi, Beijing 100029 (China); Qian Jixin [School of Information Science and Technology, Zhejiang University, Xihu-qu, Hangzhou-shi, Zhejiang 300027 (China)

    2008-06-15

    This paper studies the economical effect of considering rectifying/stripping section type heat integration in a pressure-swing distillation (PSD) process separating a binary homogeneous pressure-sensitive azeotrope. The schemes for arranging heat integration between the rectifying section and the stripping section of the high- and low-pressure distillation columns, respectively, are derived and an effective procedure is devised for the conceptual process design of the heat-integrated PSD processes. In terms of the separation of a binary azeotropic mixture of acetonitrile and water, intensive comparisons are made between the conventional and heat-integrated PSD processes. It is demonstrated that breaking a pressure-sensitive azeotropic mixture can be made more economical than the current practice with the conventional PSD process. For boosting further the thermodynamic efficiency of a PSD process, it is strongly suggested to consider simultaneously the condenser/reboiler type heat integration with the rectifying/stripping section type heat integration in process synthesis and design.

  18. The asymptotic and exact Fisher information matrices of a vector ARMA process

    NARCIS (Netherlands)

    Klein, A.; Melard, G.; Saidi, A.

    2008-01-01

    The exact Fisher information matrix of a Gaussian vector autoregressive-moving average (VARMA) process has been considered for a time series of length N in relation to the exact maximum likelihood estimation method. In this paper it is shown that the Gaussian exact Fisher information matrix
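For intuition about the vector ARMA family this record analyzes, the univariate ARMA(1,1) recursion x_t = φ·x_{t−1} + e_t + θ·e_{t−1} is easy to simulate. This is a hedged sketch of the scalar process only (illustrative parameters, not the record's VARMA Fisher-information computation):

```python
import random

def simulate_arma11(phi, theta, n, seed=1):
    """Simulate x_t = phi*x_{t-1} + e_t + theta*e_{t-1} with e_t ~ N(0, 1)."""
    rng = random.Random(seed)
    x, x_prev, e_prev = [], 0.0, 0.0
    for _ in range(n):
        e = rng.gauss(0.0, 1.0)
        x_t = phi * x_prev + e + theta * e_prev
        x.append(x_t)
        x_prev, e_prev = x_t, e
    return x

# Stationary variance of ARMA(1,1): (1 + theta^2 + 2*phi*theta) / (1 - phi^2)
series = simulate_arma11(phi=0.5, theta=0.3, n=20000)
mean = sum(series) / len(series)
var = sum((v - mean) ** 2 for v in series) / len(series)
```

With φ = 0.5 and θ = 0.3 the theoretical variance is 1.39 / 0.75 ≈ 1.85, and the sample variance of a long run should land close to it; the exact-likelihood machinery in the record builds on the Gaussian density of such series.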

  19. Advanced pulse oximeter signal processing technology compared to simple averaging. I. Effect on frequency of alarms in the operating room.

    Science.gov (United States)

    Rheineck-Leyssius, A T; Kalkman, C J

    1999-05-01

    To determine the effect of a new signal processing technique (Oxismart, Nellcor, Inc., Pleasanton, CA) on the incidence of false pulse oximeter alarms in the operating room (OR). Prospective observational study. Nonuniversity hospital. 53 ASA physical status I, II, and III consecutive patients undergoing general anesthesia with tracheal intubation. In the OR we compared the number of alarms produced by a recently developed third-generation pulse oximeter (Nellcor Symphony N-3000) with the Oxismart signal processing technique and a conventional pulse oximeter (Criticare 504). Three pulse oximeters were used simultaneously in each patient: a Nellcor pulse oximeter, a Criticare with the signal averaging time set at 3 seconds (Criticareaverage3s) and a similar unit with the signal averaging time set at 21 seconds (Criticareaverage21s). For each pulse oximeter, the number of false (artifact) alarms was counted. One false alarm was produced by the Nellcor (duration 55 sec) and one false alarm by the Criticareaverage21s monitor (5 sec). The incidence of false alarms was higher with Criticareaverage3s: in eight patients, Criticareaverage3s produced 20 false alarms, significantly more than the Nellcor monitor with Oxismart signal processing or the Criticare monitor with the longer averaging time of 21 seconds.
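The averaging-time effect studied in this record can be illustrated with a toy simulation: a brief motion artifact drops the raw SpO2 reading, a 3-sample running mean still crosses a 90% alarm threshold, while a 21-sample mean smooths the artifact out. All numbers below are illustrative, not the study's data.

```python
def moving_average(signal, window):
    """Running mean over the last `window` samples (shorter at the start)."""
    out = []
    for i in range(len(signal)):
        lo = max(0, i - window + 1)
        out.append(sum(signal[lo:i + 1]) / (i + 1 - lo))
    return out

def alarm_count(signal, threshold=90.0):
    """Count alarm episodes: each new excursion below the threshold."""
    alarms, below = 0, False
    for v in signal:
        if v < threshold and not below:
            alarms += 1
        below = v < threshold
    return alarms

# True saturation is a steady 98%, with one brief 4-sample motion artifact.
spo2 = [98.0] * 120
for i in range(60, 64):
    spo2[i] = 85.0

short_avg = moving_average(spo2, 3)   # ~3 s averaging: artifact survives
long_avg = moving_average(spo2, 21)   # ~21 s averaging: artifact smoothed
```

At one sample per second the 21-sample window dilutes the 4-second dip to about 95.5%, which stays above the alarm criterion; this is the false-alarm reduction the longer averaging time buys, at the cost of delayed detection of real desaturations.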

  20. Moving finite element method aided by computerized symbolic manipulation and its application to dynamic fracture simulation

    International Nuclear Information System (INIS)

    Nishioka, Toshihisa; Takemoto, Yutaka

    1988-01-01

    Recently, the authors have shown that the combined method of the path-independent J' integral (dynamic J integral) and a moving isoparametric element procedure is an effective tool for the calculation of dynamic stress intensity factors. In the moving element procedure, the nodal pattern of the elements near a crack tip moves according to the motion of the crack-tip. An iterative numerical technique was used in the previous procedure to find the natural coordinates (ξ, η) at the newly created nodes. This technique requires additional computing time because of the nature of iteration. In the present paper, algebraic expressions for the transformation of the global coordinates (x, y) to the natural coordinates (ξ, η) were obtained by using a computerized symbolic manipulation system (REDUCE 3.2). These algebraic expressions are also very useful for remeshing or zooming techniques often used in finite element analysis. The present moving finite element method demonstrates its effectiveness for the simulation of a fast fracture. (author)

  1. Moving related to separation : who moves and to what distance

    NARCIS (Netherlands)

    Mulder, Clara H.; Malmberg, Gunnar

    We address the issue of moving from the joint home on the occasion of separation. Our research question is: To what extent can the occurrence of moves related to separation, and the distance moved, be explained by ties to the location, resources, and other factors influencing the likelihood of

  2. Integrating discount usability in scrum development process in Ethiopia

    DEFF Research Database (Denmark)

    Teka, Degif; Dittrich, Y.; Kifle, Mesfin

    2017-01-01

    ... be adapted and integrated into Scrum-agile development with special emphasis on the Ethiopian context. The research aims at adapting software engineering and ICT development methods to the specific situation and integrating user-centered design (UCD) and lightweight usability methods into agile development with ... end users and developers. Culturally adapted user pair testing and heuristic evaluation supported usability testing and helped developers get early feedback. An integrated approach combining discount usability with the Scrum process has been developed and evaluated first with the involved...

  3. Moving Beyond ERP Components: A Selective Review of Approaches to Integrate EEG and Behavior

    Science.gov (United States)

    Bridwell, David A.; Cavanagh, James F.; Collins, Anne G. E.; Nunez, Michael D.; Srinivasan, Ramesh; Stober, Sebastian; Calhoun, Vince D.

    2018-01-01

    Relationships between neuroimaging measures and behavior provide important clues about brain function and cognition in healthy and clinical populations. While electroencephalography (EEG) provides a portable, low cost measure of brain dynamics, it has been somewhat underrepresented in the emerging field of model-based inference. We seek to address this gap in this article by highlighting the utility of linking EEG and behavior, with an emphasis on approaches for EEG analysis that move beyond focusing on peaks or “components” derived from averaging EEG responses across trials and subjects (generating the event-related potential, ERP). First, we review methods for deriving features from EEG in order to enhance the signal within single-trials. These methods include filtering based on user-defined features (i.e., frequency decomposition, time-frequency decomposition), filtering based on data-driven properties (i.e., blind source separation, BSS), and generating more abstract representations of data (e.g., using deep learning). We then review cognitive models which extract latent variables from experimental tasks, including the drift diffusion model (DDM) and reinforcement learning (RL) approaches. Next, we discuss ways to assess associations among these measures, including statistical models, data-driven joint models and cognitive joint modeling using hierarchical Bayesian models (HBMs). We think that these methodological tools are likely to contribute to theoretical advancements, and will help inform our understanding of brain dynamics that contribute to moment-to-moment cognitive function. PMID:29632480
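As a concrete instance of the single-trial frequency decomposition mentioned in this record, band power can be read off a plain DFT of one trial. The 10 Hz "alpha" rhythm, sampling rate, and band limits below are illustrative assumptions, not the article's data:

```python
import math
import random

def band_power(trial, fs, f_lo, f_hi):
    """Single-trial spectral power in [f_lo, f_hi] Hz via a plain DFT."""
    N = len(trial)
    power = 0.0
    for k in range(1, N // 2 + 1):
        f = k * fs / N
        if f_lo <= f <= f_hi:
            re = sum(trial[n] * math.cos(2 * math.pi * k * n / N)
                     for n in range(N))
            im = -sum(trial[n] * math.sin(2 * math.pi * k * n / N)
                      for n in range(N))
            power += (re * re + im * im) / (N * N)
    return power

# Simulated trial: a 10 Hz alpha-band oscillation plus Gaussian noise.
fs = 128
random.seed(3)
trial = [math.sin(2 * math.pi * 10 * n / fs) + 0.3 * random.gauss(0.0, 1.0)
         for n in range(256)]

alpha = band_power(trial, fs, 8, 12)    # should dominate
beta = band_power(trial, fs, 18, 30)    # noise only
```

Such per-trial features are what the reviewed approaches then relate to behavioral latent variables, instead of averaging trials into an ERP first.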

  4. Experimental demonstration of squeezed-state quantum averaging

    DEFF Research Database (Denmark)

    Lassen, Mikael Østergaard; Madsen, Lars Skovgaard; Sabuncu, Metin

    2010-01-01

    We propose and experimentally demonstrate a universal quantum averaging process implementing the harmonic mean of quadrature variances. The averaged variances are prepared probabilistically by means of linear optical interference and measurement-induced conditioning. We verify that the implemented...
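    The averaging rule this record implements optically, the harmonic mean of quadrature variances, is simple to state numerically; the variance values below are illustrative, not the experiment's data.

```python
def harmonic_mean(v1, v2):
    """Harmonic mean of two quadrature variances: 2 / (1/v1 + 1/v2)."""
    return 2.0 / (1.0 / v1 + 1.0 / v2)

# Illustrative squeezed / anti-squeezed variances (shot-noise units, assumed):
print(harmonic_mean(0.5, 2.0))   # 0.8, below the arithmetic mean of 1.25
```

    The harmonic mean always lies below the arithmetic mean, so the averaged variance is pulled toward the more strongly squeezed input.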

  5. Design Process for Integrated Concepts with Responsive Building Elements

    DEFF Research Database (Denmark)

    Aa, Van der A.; Heiselberg, Per

    2008-01-01

    An integrated building concept is a prerequisite for achieving an energy-efficient building with good and healthy indoor air quality and comfort. A design process that defines the targets and boundary conditions in the very first stage of the design and guarantees them until the building is finished and used...... is needed. The hard question, however, is how to make the right choice of the combination of individual measures from building components and building services elements. Within the framework of IEA-ECBCS Annex 44, research has been conducted on the design process for integrated building concepts...

  6. Customer-focused planning: Beyond integrated resource planning

    International Nuclear Information System (INIS)

    Hastings, P.C.

    1992-01-01

    Integrated resource planning (IRP) evolved from the growing recognition by utilities and regulators that efforts to influence the use of electricity by customers could be more cost-effective than simply expanding the generation system. Improvements in IRP methodology are taking many different forms. One major effort is to move planning closer to the customer. Customer-focused planning (CFP) starts with customer values and uses these to drive decision-making within the utility. CFP is process- rather than product-oriented and typically operates at the bulk power system level. Options available to meet customer needs include electricity, alternative fuels, capital substitution, and end-use management or control. The customer selects the option(s) based on a value set that typically includes safety, reliability, convenience, and cost. There are also four possible levels of decision-making: the end-use; customer/power meter; transmission/distribution interface; and the utility bulk power system. Challenges of implementing CFP include identifying customer wants, needs, and values; integration of utility planning efforts; and the dynamics of the CFP process, in which costs can change with each modification of the transmission and distribution system. Two examples of recent moves toward CFP at Central Maine Power are reviewed. 2 refs., 1 fig

  7. Production of citric acid using its extraction wastewater treated by anaerobic digestion and ion exchange in an integrated citric acid-methane fermentation process.

    Science.gov (United States)

    Xu, Jian; Chen, Yang-Qiu; Zhang, Hong-Jian; Tang, Lei; Wang, Ke; Zhang, Jian-Hua; Chen, Xu-Sheng; Mao, Zhong-Gui

    2014-08-01

    In order to solve the problem of extraction wastewater pollution in citric acid industry, an integrated citric acid-methane fermentation process is proposed in this study. Extraction wastewater was treated by mesophilic anaerobic digestion and then used to make mash for the next batch of citric acid fermentation. The recycling process was done for seven batches. Citric acid production (82.4 g/L on average) decreased by 34.1 % in the recycling batches (2nd-7th) compared with the first batch. And the residual reducing sugar exceeded 40 g/L on average in the recycling batches. Pigment substances, acetic acid, ammonium, and metal ions in anaerobic digestion effluent (ADE) were considered to be the inhibitors, and their effects on the fermentation were studied. Results indicated that ammonium, Na(+) and K(+) in the ADE significantly inhibited citric acid fermentation. Therefore, the ADE was treated by acidic cation exchange resin prior to reuse to make mash for citric acid fermentation. The recycling process was performed for ten batches, and citric acid productions in the recycling batches were 126.6 g/L on average, increasing by 1.7 % compared with the first batch. This process could eliminate extraction wastewater discharge and reduce water resource consumption.

  8. An ergonomics action research demonstration: integrating human factors into assembly design processes.

    Science.gov (United States)

    Village, J; Greig, M; Salustri, F; Zolfaghari, S; Neumann, W P

    2014-01-01

    In action research (AR), the researcher participates 'in' the actions in an organisation, while simultaneously reflecting 'on' the actions to promote learning for both the organisation and the researchers. This paper demonstrates a longitudinal AR collaboration with an electronics manufacturing firm where the goal was to improve the organisation's ability to integrate human factors (HF) proactively into their design processes. During the three-year collaboration, all meetings, workshops, interviews and reflections were digitally recorded and qualitatively analysed to inform new 'actions'. By the end of the collaboration, HF tools with targets and sign-off by the HF specialist were integrated into several stages of the design process, and engineers were held accountable for meeting the HF targets. We conclude that the AR approach combined with targeting multiple initiatives at different stages of the design process helped the organisation find ways to integrate HF into their processes in a sustainable way. Researchers acted as a catalyst to help integrate HF into the engineering design process in a sustainable way. This paper demonstrates how an AR approach can help achieve HF integration, the benefits of using a reflective stance and one method for reporting an AR study.

  9. The Effect of Information Security Management on Organizational Processes Integration in Supply Chain

    Directory of Open Access Journals (Sweden)

    Mohsen Shafiei Nikabadi

    2012-03-01

    Full Text Available: The major purpose of this article was to examine how information security management affects the integration of organizational processes in the supply chain, and in particular the effect of implementing an information security management system on enhancing that integration. The research sought a combined view of these two approaches (information security management, and the integration of organizational processes through an enterprise resource planning system) and then determined the factors underlying both issues by factor analysis. The researchers collected the opinions of automotive experts (in production planning and management, and in the supply chain) at car makers and at first- and second-tier suppliers in the industry. The impact of information security management on the integration of enterprise supply chain processes was then assessed with the help of statistical correlation analysis. The results indicated that several dimensions of an information security management system (coordination of information, prevention of human and hardware errors, accuracy of information, and user education) affect both the internal and the external integration of business processes in the supply chain, and can thereby increase the integration of business processes across the supply chain. Owing to these results, deploying an information security management system increases the integration of organizational processes in the supply chain; this can be demonstrated by considering the relation of organizational process integration to the level of information coordination, error prevention and information accuracy throughout the supply chain.

  10. Integrated forward osmosis-membrane distillation process for human urine treatment.

    Science.gov (United States)

    Liu, Qianliang; Liu, Caihong; Zhao, Lei; Ma, Weichao; Liu, Huiling; Ma, Jun

    2016-03-15

    This study demonstrated a forward osmosis-membrane distillation (FO-MD) hybrid system for real human urine treatment. A series of NaCl solutions at different concentrations were adopted for draw solutions in FO process, which were also the feed solutions of MD process. To establish a stable and continuous integrated FO-MD system, individual FO process with different NaCl concentrations and individual direct contact membrane distillation (DCMD) process with different feed temperatures were firstly investigated separately. Four stable equilibrium conditions were obtained from matching the water transfer rates of individual FO and MD processes. It was found that the integrated system is stable and sustainable when the water transfer rate of FO subsystem is equal to that of MD subsystem. The rejections to main contaminants in human urine were also investigated. Although individual FO process had relatively high rejection to Total Organic Carbon (TOC), Total Nitrogen (TN) and Ammonium Nitrogen (NH4(+)-N) in human urine, these contaminants could also accumulate in draw solution after long term performance. The MD process provided an effective rejection to contaminants in draw solution after FO process and the integrated system revealed nearly complete rejection to TOC, TN and NH4(+)-N. This work provided a potential treatment process for human urine in some fields such as water regeneration in space station and water or nutrient recovery from source-separated urine. Copyright © 2016 Elsevier Ltd. All rights reserved.

  11. An integrated model for supplier selection process

    Institute of Scientific and Technical Information of China (English)

    2003-01-01

    In today's highly competitive manufacturing environment, the supplier selection process has become one of the crucial activities in supply chain management. In order to select the best supplier(s), it is necessary not only to continuously track and benchmark the performance of suppliers but also to make trade-offs between tangible and intangible factors, some of which may conflict. In this paper an integration of case-based reasoning (CBR), analytical network process (ANP) and linear programming (LP) is proposed to solve the supplier selection problem.
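    The LP stage of such an integrated CBR-ANP-LP approach can be sketched as a small order-allocation program; the costs, capacities and demand below are invented for illustration and the paper's actual formulation may differ.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical data (not from the paper): unit costs and capacities of three
# candidate suppliers, and a total demand to split among them.
cost = np.array([4.0, 5.0, 3.0])         # $/unit
capacity = np.array([60.0, 80.0, 50.0])  # units
demand = 100.0

res = linprog(
    c=cost,                                   # minimize total purchase cost
    A_eq=np.ones((1, 3)), b_eq=[demand],      # allocations must sum to demand
    bounds=[(0.0, cap) for cap in capacity],  # respect each supplier's capacity
)
# Optimum: 50 units from supplier 3 (cheapest), 50 from supplier 1; cost 350.
print(res.x, res.fun)
```

    In a full implementation the cost vector would itself come from the ANP weights and CBR-retrieved performance scores rather than being given directly.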

  12. Average thermal stress in the Al+SiC composite due to its manufacturing process

    International Nuclear Information System (INIS)

    Miranda, Carlos A.J.; Libardi, Rosani M.P.; Marcelino, Sergio; Boari, Zoroastro M.

    2013-01-01

    The numerical analysis framework used to obtain the average thermal stress in the Al+SiC composite due to its manufacturing process is presented along with the results obtained. The aluminum and SiC powders are mixed at elevated temperature, while the composite is used at room temperature; a thermal stress state therefore arises in the composite due to the different thermal expansion coefficients of the materials. Because of the particle size and the randomness of the SiC distribution, several sets of models were analyzed and a statistical procedure was used to evaluate the average stress state in the composite. In each model the particle positions, shapes and sizes are randomly generated considering a volumetric ratio (VR) between 20% and 25%, close to that of an actual composite. The obtained stress field is represented by a certain number of iso-stress curves, each one weighted by the area it represents. The influence of the following factors was systematically investigated: (a) the material behavior: linear vs. non-linear; (b) the carbide particle shape: circular vs. quadrilateral; (c) the number of iso-stress curves considered in each analysis; and (d) the model size (the number of particles). Each analyzed condition produced conclusions that guided the next step. Considering a confidence level of 95%, the average thermal stress value in the studied composite (20% ≤ VR ≤ 25%) is 175 MPa with a standard deviation of 10 MPa. Depending on its usage, this value should be taken into account when evaluating the material strength. (author)
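    The closing statistic (a 175 MPa average with 10 MPa standard deviation at a 95% confidence level) corresponds to a routine confidence-interval calculation; the sketch below reproduces that arithmetic on synthetic per-model stresses drawn with the reported mean and spread as assumed parameters.

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic per-model average stresses (MPa). The values 175 and 10 are the
# paper's reported mean and standard deviation, used here only to generate
# illustrative data; the number of models (30) is an assumption.
samples = rng.normal(loc=175.0, scale=10.0, size=30)

mean = samples.mean()
sem = samples.std(ddof=1) / np.sqrt(len(samples))   # standard error of the mean
ci = (mean - 1.96 * sem, mean + 1.96 * sem)         # normal-approximation 95% CI
print(round(mean, 1), tuple(round(v, 1) for v in ci))
```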

  13. Integrating Leadership Processes: Redefining the Principles Course.

    Science.gov (United States)

    Neff, Bonita Dostal

    2002-01-01

    Revamps the principles of a public relations course, the first professional course in the public relations sequence, by integrating a leadership process and a service-learning component. Finds that more students are reflecting the interpersonal and team skills desired in the 1998 national study on public relations. (SG)

  14. GSMNet: A Hierarchical Graph Model for Moving Objects in Networks

    Directory of Open Access Journals (Sweden)

    Hengcai Zhang

    2017-03-01

    Full Text Available Existing data models for moving objects in networks offer limited flexibility in controlling the granularity of network representation and the cost of location updates, and they do not encompass semantic information such as traffic states, traffic restrictions and social relationships. In this paper, we aim to fill the gap left by traditional network-constrained models and propose a hierarchical graph model called the Geo-Social-Moving model for moving objects in Networks (GSMNet) that adopts four graph structures, RouteGraph, SegmentGraph, ObjectGraph and MoveGraph, to represent the underlying networks, trajectories and semantic information in an integrated manner. A set of user-defined data types and corresponding operators is proposed to handle moving objects and answer a new class of queries supporting three kinds of conditions: spatial, temporal and semantic. We then develop a prototype system on the native graph database system Neo4j to implement the proposed GSMNet model. In the experiment, we conduct a performance evaluation using simulated trajectories generated from the BerlinMOD (Berlin Moving Objects Database) benchmark and compare with the mature MOD system Secondo. The results of 17 benchmark queries demonstrate that our proposed GSMNet model has strong potential to reduce time-consuming table join operations and shows remarkable advantages in representing semantic information and controlling the cost of location updates.
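    As a toy analogue of the core idea, combining network structure, trajectories and semantic information in one queryable model, the sketch below uses plain Python structures; the field names and the query are our assumptions, not the GSMNet schema or its Neo4j implementation.

```python
from dataclasses import dataclass, field

# A toy in-memory analogue of attaching semantic tags to network-constrained
# trajectories. All names and fields here are illustrative assumptions.

@dataclass
class MovingObject:
    oid: str
    trajectory: list = field(default_factory=list)  # (timestamp, segment_id)
    tags: set = field(default_factory=set)          # semantic info, e.g. "taxi"

segments = {"s1": ("A", "B"), "s2": ("B", "C")}     # SegmentGraph analogue
objs = [
    MovingObject("o1", [(0, "s1"), (5, "s2")], {"taxi"}),
    MovingObject("o2", [(1, "s1")], {"bus"}),
]

def on_segment(seg, tag):
    """Objects carrying a semantic tag whose trajectory visits segment `seg`."""
    return [o.oid for o in objs
            if tag in o.tags and any(s == seg for _, s in o.trajectory)]

print(on_segment("s2", "taxi"))   # ['o1']
```

    In the paper's setting, the same kind of mixed spatial/semantic query is answered by graph traversal in Neo4j instead of table joins, which is where the reported performance advantage comes from.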

  15. Determining the Efficiency of Adaptation of Foreign Economic Activity of Machine-Building Enterprises in Conditions of Deepening the European Integration Process of Ukraine

    Directory of Open Access Journals (Sweden)

    Semeniuk Iryna Yu.

    2018-02-01

    Full Text Available The article determines that introducing and implementing a mechanism for the foreign economic adaptation of machine-building enterprises to the conditions of the European integration processes requires constant monitoring of export-import operations and adaptation activities in order to identify current problems and avoid risks. One of the monitoring instruments is a system of indicators, which makes it possible to evaluate the efficiency of the adaptation mechanism by comparing the indicator values obtained after adaptation changes with the values of previous periods. It is suggested to determine the efficiency of adaptation of the foreign economic activity of machine-building enterprises to the deepening European integration process of Ukraine by means of: the index of change in the volume of the enterprise's exports to the EU countries; the weighted average change in the share of the European market covered by the enterprise's products; an indicator of the efficiency of the enterprise's exports to the European Union countries; an index of changes in the volume of permanent orders from European partners; and an integral indicator of the efficiency of use of the enterprise's adaptive potential under integration processes.

  16. Integrated Process Design and Control of Multi-element Reactive Distillation Processes

    DEFF Research Database (Denmark)

    Mansouri, Seyed Soheil; Sales-Cruz, Mauricio; Huusom, Jakob Kjøbsted

    2016-01-01

    In this work, integrated process design and control of reactive distillation processes involving multi-elements is presented. The reactive distillation column is designed using methods and tools which are similar in concept to non-reactive distillation design methods, such as driving force approach....... The methods employed in this work are based on equivalent element concept. This concept facilitates the representation of a multi-element reactive system as equivalent binary light and heavy key elements. First, the reactive distillation column is designed at the maximum driving force where through steady...

  17. Optimization of startup and shutdown operation of simulated moving bed chromatographic processes.

    Science.gov (United States)

    Li, Suzhou; Kawajiri, Yoshiaki; Raisch, Jörg; Seidel-Morgenstern, Andreas

    2011-06-24

    This paper presents new multistage optimal startup and shutdown strategies for simulated moving bed (SMB) chromatographic processes. The proposed concept allows to adjust transient operating conditions stage-wise, and provides capability to improve transient performance and to fulfill product quality specifications simultaneously. A specially tailored decomposition algorithm is developed to ensure computational tractability of the resulting dynamic optimization problems. By examining the transient operation of a literature separation example characterized by nonlinear competitive isotherm, the feasibility of the solution approach is demonstrated, and the performance of the conventional and multistage optimal transient regimes is evaluated systematically. The quantitative results clearly show that the optimal operating policies not only allow to significantly reduce both duration of the transient phase and desorbent consumption, but also enable on-spec production even during startup and shutdown periods. With the aid of the developed transient procedures, short-term separation campaigns with small batch sizes can be performed more flexibly and efficiently by SMB chromatography. Copyright © 2011 Elsevier B.V. All rights reserved.

  18. Velocity Mapping Toolbox (VMT): a processing and visualization suite for moving-vessel ADCP measurements

    Science.gov (United States)

    Parsons, D.R.; Jackson, P.R.; Czuba, J.A.; Engel, F.L.; Rhoads, B.L.; Oberg, K.A.; Best, J.L.; Mueller, D.S.; Johnson, K.K.; Riley, J.D.

    2013-01-01

    The use of acoustic Doppler current profilers (ADCP) for discharge measurements and three-dimensional flow mapping has increased rapidly in recent years and has been primarily driven by advances in acoustic technology and signal processing. Recent research has developed a variety of methods for processing data obtained from a range of ADCP deployments and this paper builds on this progress by describing new software for processing and visualizing ADCP data collected along transects in rivers or other bodies of water. The new utility, the Velocity Mapping Toolbox (VMT), allows rapid processing (vector rotation, projection, averaging and smoothing), visualization (planform and cross-section vector and contouring), and analysis of a range of ADCP-derived datasets. The paper documents the data processing routines in the toolbox and presents a set of diverse examples that demonstrate its capabilities. The toolbox is applicable to the analysis of ADCP data collected in a wide range of aquatic environments and is made available as open-source code along with this publication.
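    The vector-rotation step such ADCP processing performs can be illustrated in a few lines; the rotation convention and the section azimuth below are assumptions for illustration and may differ from VMT's actual conventions.

```python
import numpy as np

def rotate_to_xsec(east, north, azimuth_deg):
    """Rotate east/north velocities into cross-section coordinates.

    Returns (streamwise, transverse) components for a section whose
    along-section direction has the given compass azimuth (one common
    convention; VMT's internals may differ)."""
    th = np.deg2rad(azimuth_deg)
    streamwise = east * np.sin(th) + north * np.cos(th)
    transverse = -east * np.cos(th) + north * np.sin(th)
    return streamwise, transverse

# A 1 m/s flow due east projected onto a section aligned due east (azimuth 90):
u, v = rotate_to_xsec(np.array([1.0]), np.array([0.0]), 90.0)
print(float(u[0]), float(v[0]))   # streamwise ~ 1.0, transverse ~ 0.0
```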

  19. The Role of CAD in Enterprise Integration Process

    Directory of Open Access Journals (Sweden)

    M. Ota

    2004-01-01

    Full Text Available This article deals with the mutual influence between software systems used in the enterprise environment and enterprise integration processes. The position of CAD data and CAx systems in the integrated environment of manufacturing enterprises is clarified and, as a consequence, the key role of the CAx systems used in those companies is emphasized. It is noted that the integration of CAD data is nowadays achieved only at a secondary level, via primarily integrated PDM systems. This limitation is the reason why we are developing a unified communication model focused on product-oriented data. Our approach is based on Internet technologies, so we believe it is sufficiently independent. The proposed system of communication is based on a simple request-reply dialogue. The structure of this model is open and extensible, but we assume supervision supported by an Internet portal.

  20. Average geodesic distance of skeleton networks of Sierpinski tetrahedron

    Science.gov (United States)

    Yang, Jinjin; Wang, Songjing; Xi, Lifeng; Ye, Yongchao

    2018-04-01

    The average distance is a key quantity in the study of complex networks and is related to the Wiener sum, a topological invariant in chemical graph theory. In this paper, we study the skeleton networks of the Sierpinski tetrahedron, an important self-similar fractal, and obtain an asymptotic formula for their average distances. To derive the formula, we develop a technique we call finite patterns of the integral of geodesic distance with respect to the self-similar measure on the Sierpinski tetrahedron.
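    For a finite graph, the average geodesic distance studied here can be computed directly by breadth-first search; the 4-cycle below is purely an illustrative example, unrelated to the skeleton networks of the paper.

```python
from collections import deque

def avg_geodesic_distance(adj):
    """Average shortest-path distance over all ordered vertex pairs of an
    unweighted connected graph given as an adjacency dict (equal to the
    unordered-pair average by symmetry)."""
    nodes = list(adj)
    total, pairs = 0, 0
    for s in nodes:
        dist = {s: 0}
        q = deque([s])
        while q:                           # breadth-first search from s
            x = q.popleft()
            for y in adj[x]:
                if y not in dist:
                    dist[y] = dist[x] + 1
                    q.append(y)
        total += sum(dist.values())
        pairs += len(nodes) - 1
    return total / pairs

# 4-cycle: from each vertex the distances are 1, 1, 2, so the average is 4/3.
cycle4 = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [0, 2]}
print(avg_geodesic_distance(cycle4))   # 1.3333...
```

    The paper's contribution is precisely that this brute-force count is replaced by a closed asymptotic formula as the skeleton networks grow.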

  1. Big Data X-Learning Resources Integration and Processing in Cloud Environments

    Directory of Open Access Journals (Sweden)

    Kong Xiangsheng

    2014-09-01

    Full Text Available Cloud computing platforms are highly flexible, and more and more learning systems are being migrated to them. Firstly, this paper describes different types of educational environments and the data they provide. Then, it proposes an architecture for mining, integrating and processing heterogeneous learning resources. In order to integrate and process the different types of learning resources found in different educational environments, the paper proposes a novel solution with a massive-storage integration algorithm and a conversion algorithm for storing and managing heterogeneous learning resources in cloud environments.

  2. The Integration Order of Vector Autoregressive Processes

    DEFF Research Database (Denmark)

    Franchi, Massimo

    We show that the order of integration of a vector autoregressive process is equal to the difference between the multiplicity of the unit root in the characteristic equation and the multiplicity of the unit root in the adjoint matrix polynomial. The equivalence with the standard I(1) and I(2...
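    The stated rule (order of integration equals the multiplicity of the unit root in the determinant of the characteristic polynomial minus its multiplicity in the adjoint matrix polynomial) can be checked mechanically on a small example with SymPy; the VAR(1) coefficient matrix below is our own illustration, not taken from the paper.

```python
import sympy as sp

z = sp.symbols('z')

def mult_at_one(expr):
    """Multiplicity of z = 1 as a root of a univariate polynomial in z."""
    p = sp.Poly(sp.expand(expr), z)
    m = 0
    while p.eval(1) == 0:
        p = sp.Poly(sp.quo(p.as_expr(), z - 1, z), z)
        m += 1
    return m

# Illustrative VAR(1): x_t = A x_{t-1} + e_t with a single unit root.
A = sp.Matrix([[1, 0], [0, sp.Rational(1, 2)]])
Az = sp.eye(2) - A * z                 # characteristic polynomial A(z) = I - A z

m_det = mult_at_one(Az.det())          # multiplicity of z = 1 in det A(z)
m_adj = min(mult_at_one(e) for e in Az.adjugate() if sp.expand(e) != 0)
print(m_det, m_adj, m_det - m_adj)     # the difference gives the integration order
```

    Here det A(z) = (1 - z)(1 - z/2) has the unit root with multiplicity 1 while the adjugate does not vanish at z = 1, so the process is I(1), consistent with the stated rule.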

  3. Power up your plant - An introduction to integrated process and power automation

    Energy Technology Data Exchange (ETDEWEB)

    Vasel, Jeffrey

    2010-09-15

    This paper discusses how a single integrated system can increase energy efficiency, improve plant uptime, and lower life cycle costs. Integrated Process and Power Automation is a new system integration architecture and power strategy that addresses the needs of the process and power generation industries. The architecture is based on Industrial Ethernet standards such as IEC 61850 and Profinet as well as Fieldbus technologies. The energy efficiency gains from integration are discussed in a power generation use case. A power management system success story from a major oil and gas company, Petrobras, is also discussed.

  4. Integrated HLW Conceptual Process Flowsheet(s) for the Crystalline Silicotitanate Process SRDF-98-04

    International Nuclear Information System (INIS)

    Jacobs, R.A.

    1998-01-01

    The Strategic Research and Development Fund (SRDF) provided funds to develop integrated conceptual flowsheets and material balances for a CST process as a potential replacement for, or second generation to, the ITP process. This task directly supports another SRDF task: Glass Form for HLW Sludge with CST, SRDF-98-01, by M. K. Andrews which seeks to further develop sludge/CST glasses that could be used if the ITP process were replaced by CST ion exchange. The objective of the proposal was to provide flowsheet support for development and evaluation of a High Level Waste Division process to replace ITP. The flowsheets would provide a conceptual integrated material balance showing the impact on the HLW division. The evaluation would incorporate information to be developed by Andrews and Harbour on CST/DWPF glass formulations and provide the bases for evaluating the economic impact of the proposed replacement process. Coincident with this study, the Salt Disposition Team began its evaluation of alternatives for disposition of the HLW salts in the SRS waste tanks. During that time, the CST IX process was selected as one of four alternatives (of eighteen Phase II alternatives) for further evaluation during Phase III

  5. Impact of informal institutions on the development integration processes

    Directory of Open Access Journals (Sweden)

    Sidorova Alexandra, M.

    2015-06-01

    Full Text Available The paper deals with the impact of informal institutions on the direction and development of integration processes in the countries of the Customs Union and Ukraine. The degree to which this phenomenon has been studied in different economic schools is determined in the article. Economic mentality is a basic informal institution that determines how effective integration processes can be. The paper examines the nature and characteristics of economic mentality and its effects on people's economic activity. The ethnometrical method makes it possible to quantify economic mentality, which enables a deeper understanding and analysis of the formation and functioning of political and economic systems, especially of business and management, and of establishing contacts with other cultures. Modern Belarusian economic mentality was measured using Hofstede's international methodology and compared with the economic mentality of Russia, Ukraine and Kazakhstan. With the help of cluster analysis, the congruence of the economic mentalities of the Customs Union countries and Ukraine was determined. The economic mentality of these countries was also compared with that of other countries in order to identify the main types of economic culture.

  6. Pilots' Attention Distributions Between Chasing a Moving Target and a Stationary Target.

    Science.gov (United States)

    Li, Wen-Chin; Yu, Chung-San; Braithwaite, Graham; Greaves, Matthew

    2016-12-01

    Attention plays a central role in cognitive processing; ineffective attention may induce accidents in flight operations. The objective of the current research was to examine military pilots' attention distributions between chasing a moving target and a stationary target. In the current research, 37 mission-ready F-16 pilots participated. Subjects' eye movements were collected by a portable head-mounted eye-tracker during tactical training in a flight simulator. The scenarios of chasing a moving target (air-to-air) and a stationary target (air-to-surface) consist of three operational phases: searching, aiming, and lock-on to the targets. The findings demonstrated significant differences in pilots' percentage of fixation during the searching phase between air-to-air (M = 37.57, SD = 5.72) and air-to-surface (M = 33.54, SD = 4.68). Fixation duration can indicate pilots' sustained attention to the trajectory of a dynamic target during air combat maneuvers. Aiming at the stationary target resulted in larger pupil size (M = 27,105, SD = 6565), reflecting higher cognitive loading than aiming at the dynamic target (M = 23,864, SD = 8762). Pilots' visual behavior is not only closely related to attention distribution, but also significantly associated with task characteristics. Military pilots demonstrated various visual scan patterns for searching and aiming at different types of targets based on the research settings of a flight simulator. The findings will facilitate system designers' understanding of military pilots' cognitive processes during tactical operations. They will assist human-centered interface design to improve pilots' situational awareness. The application of an eye-tracking device integrated with a flight simulator is a feasible and cost-effective intervention to improve the efficiency and safety of tactical training. Li W-C, Yu C-S, Braithwaite G, Greaves M. Pilots' attention distributions between chasing a moving target and a stationary target. Aerosp Med
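    The reported means and standard deviations permit a standard effect-size calculation for the searching-phase fixation difference; the pooled-SD Cohen's d below is our own added computation, not a statistic reported in the abstract.

```python
import math

# Group statistics reported in the abstract (percentage of fixations during
# the searching phase); the pooled-SD Cohen's d is our own added calculation.
m_air, sd_air = 37.57, 5.72      # air-to-air
m_surf, sd_surf = 33.54, 4.68    # air-to-surface

pooled_sd = math.sqrt((sd_air ** 2 + sd_surf ** 2) / 2.0)
d = (m_air - m_surf) / pooled_sd
print(round(d, 2))   # ~0.77, a medium-to-large effect by Cohen's benchmarks
```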

  7. Move up,Move out

    Institute of Scientific and Technical Information of China (English)

    Guo Yan

    2007-01-01

    China has already become the world's largest manufacturer of cement, copper and steel. Chinese producers have moved onto the world stage and dominated the global consumer market from textiles to electronics with amazing speed and efficiency.

  8. Globalization and Integration Processes in Europe

    Directory of Open Access Journals (Sweden)

    Beti Godnič

    2017-03-01

    Full Text Available Research Question (RQ): In the article we examine whether integration processes in the European Union are only a manifestation of globalization processes, and whether there are differences between the old EU member states (EU-15) and the new EU member states in the changed micro and macro environment. Purpose: We wanted to determine how the old EU member states (EU-15) and the new EU member states adapt to the new circumstances and other changes in the micro and macro environment. Method: Analysing the complexity of the changes in the state of the economic system, and the complex fundamental global processes that have occurred over a long period of time, requires supplementing the purely scientific approach with other types of research work and a more holistic approach of the kind commonly used in comparative economics. We have taken such an approach in this article. Results: In the article we studied the geopolitical changes in the micro and macro environment. We found that development in the old EU member states (EU-15) and in the new EU member states differs. The EU has not adopted a harmonised economic policy that would solve the »North-South« problem, build cross-state cultural consensus and find a way to operate systemically in the global environment. Organization: The findings can be used to support understanding of the micro and macro environment of companies and contribute to better strategic planning and design of the entire supply chain. Society: The findings can contribute to a better understanding of integrative processes in the EU. Limitations/Future Research: The complexity of the problem and the dynamic changes in the functioning of the global market require in-depth study of changes in the micro and macro environment of logistics companies

  9. Energy optimization of integrated process plants

    Energy Technology Data Exchange (ETDEWEB)

    Sandvig Nielsen, J

    1996-10-01

    A general approach is proposed in which process synthesis is viewed as an evolutionary process. Each step is taken according to the present level of information and knowledge. This is formulated in a Process Synthesis Cycle. Initially, the synthesis is conducted at a high abstraction level, maximizing the use of heuristics (prior experience, rules of thumb, etc.). As further knowledge and information become available, heuristics are gradually replaced by exact problem formulations. The principles of the Process Synthesis Cycle are used to develop a general procedure for energy synthesis based on available tools. The procedure relies on efficient use of process simulators with integrated Pinch capabilities (energy targeting). The proposed general procedure is tailored to three specific problems (Humid Air Turbine power plant synthesis, Nitric Acid process synthesis and Sulphuric Acid synthesis). Using the procedure reduces the problem dimension considerably and thus allows faster evaluation of more alternatives. At a more detailed level, a new framework for the Heat Exchanger Network synthesis problem is proposed. The new framework is object oriented, based on a general functional description of all elements potentially present in the heat exchanger network (streams, exchangers, pumps, furnaces, etc.). (LN) 116 refs.

  10. Estimation of average hazardous-event-frequency for allocation of safety-integrity levels

    International Nuclear Information System (INIS)

    Misumi, Y.; Sato, Y.

    1999-01-01

One of the fundamental concepts of the draft international standard, IEC 61508, is target failure measures to be allocated to Electric/Electronic/Programmable Electronic Safety-Related Systems, i.e. Safety Integrity Levels. The Safety Integrity Levels consist of four discrete probabilistic levels for specifying the safety integrity requirements of the safety functions to be allocated to Electric/Electronic/Programmable Electronic Safety-Related Systems. In order to select the Safety Integrity Levels, the draft standard classifies Electric/Electronic/Programmable Electronic Safety-Related Systems into two modes of operation using demand frequencies only. It is not clear which mode of operation should be applied to Electric/Electronic/Programmable Electronic Safety-Related Systems taking into account the demand-state probability and the spurious demand frequency. It is essential for the allocation of Safety Integrity Levels that generic algorithms be derived involving all possible parameters, which make it possible to model the actuality of real systems. The present paper addresses this issue. First of all, the overall system including Electric/Electronic/Programmable Electronic Safety-Related Systems is described using a simplified fault-tree. Then, the relationships among demands, demand-states and proof-tests are studied. Overall systems are classified into two groups: a non-demand-state-at-proof-test system, which includes both repairable and non-repairable demand states, and a constant-demand-frequency system. New ideas such as the demand-state, spurious demand-state, mean time between detections, rates of d-failure and h-failure, and the h/d ratio are introduced in order to make the Safety Integrity Levels and modes of operation generic and comprehensive. Finally, the overall system is simplified and modeled by fault-trees using Priority-AND gates. At the same time the assumptions for modeling are described. Generic algorithms to estimate hazardous
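
The allocation logic the paper generalizes can be illustrated with the textbook low-demand relationships from IEC 61508: the average probability of failure on demand of a periodically proof-tested channel, PFDavg ~ lambda_du * T / 2, mapped onto the standard's low-demand SIL bands. A minimal Python sketch (single channel, perfect proof tests assumed; this is not the paper's generic algorithm):

```python
def pfd_avg(lambda_du, proof_test_interval_h):
    """Average probability of failure on demand for a single channel, using the
    common low-demand approximation PFDavg ~= lambda_du * T_proof / 2."""
    return lambda_du * proof_test_interval_h / 2.0

def sil_low_demand(pfd):
    """Map a PFDavg value onto the IEC 61508 low-demand-mode SIL bands."""
    bands = [(1e-5, 1e-4, 4), (1e-4, 1e-3, 3), (1e-3, 1e-2, 2), (1e-2, 1e-1, 1)]
    for lo, hi, sil in bands:
        if lo <= pfd < hi:
            return sil
    return None  # outside the tabulated SIL range

pfd = pfd_avg(lambda_du=1e-6, proof_test_interval_h=8760)  # one-year proof test
# pfd -> 0.00438, which falls in the SIL 2 band
```

The paper's point is precisely that this simple picture breaks down once demand states, spurious demands and proof-test timing interact.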

  11. Improved Anomaly Detection using Integrated Supervised and Unsupervised Processing

    Science.gov (United States)

    Hunt, B.; Sheppard, D. G.; Wetterer, C. J.

There are two broad signal-processing technologies applicable to space object feature identification using non-resolved imagery: supervised processing analyzes a large set of data for common characteristics that can then be used to identify, transform, and extract information from new data taken of the same given class (e.g. a support vector machine); unsupervised processing utilizes detailed physics-based models that generate comparison data that can then be used to estimate parameters presumed to be governed by the same models (e.g. estimation filters). Both processes have been used in non-resolved space object identification and yield similar results, yet arrive at them through vastly different processes. The goal of integrating the two is to achieve even greater performance by building on this process diversity. Specifically, supervised and unsupervised processing will jointly operate on the analysis of brightness (radiometric flux intensity) measurements reflected by space objects and observed by a ground station to determine whether a particular day conforms to a nominal operating mode (as determined from a training set) or exhibits anomalous behavior where a particular parameter (e.g. attitude, solar panel articulation angle) has changed in some way. It is demonstrated in a variety of different scenarios that the integrated process achieves greater performance than either of the separate processes alone.

  12. Assessing Potential Air Pollutant Emissions from Agricultural Feedstock Production using MOVES

    Energy Technology Data Exchange (ETDEWEB)

    Eberle, Annika [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Warner, Ethan [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Zhang, Yi Min [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Inman, Daniel J [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Carpenter Petri, Alberta C [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Heath, Garvin A [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Hettinger, Dylan J [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Bhatt, Arpit H [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2018-03-29

    Biomass feedstock production is expected to grow as demand for biofuels and bioenergy increases. The change in air pollutant emissions that may result from large-scale biomass supply has implications for local air quality and human health. We developed spatially explicit emissions inventories for corn grain and six cellulosic feedstocks through the extension of the National Renewable Energy Laboratory's Feedstock Production Emissions to Air Model (FPEAM). These inventories include emissions of seven pollutants (nitrogen oxides, ammonia, volatile organic compounds, particulate matter, sulfur oxides, and carbon monoxide) generated from biomass establishment, maintenance, harvest, transportation, and biofuel preprocessing activities. By integrating the EPA's MOtor Vehicle Emissions Simulator (MOVES) into FPEAM, we created a scalable framework to execute county-level runs of the MOVES-Onroad model for representative counties (i.e., those counties with the largest amount of cellulosic feedstock production in each state) on a national scale. We used these results to estimate emissions from the on-road transportation of biomass and combined them with county-level runs of the MOVES-Nonroad model to estimate emissions from agricultural equipment. We also incorporated documented emission factors to estimate emissions from chemical application and the operation of drying equipment for feedstock processing, and used methods developed by the EPA and the California Air Resources Board to estimate fugitive dust emissions. The model developed here could be applied to custom equipment budgets and is extensible to accommodate additional feedstocks and pollutants. Future work will also extend this model to analyze spatial boundaries beyond the county-scale (e.g., regional or sub-county levels).
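
At its core, such an inventory construction is activity data multiplied by emission factors, summed per county and pollutant. A toy Python sketch of that aggregation (all record values and the FIPS code are hypothetical illustrations, not FPEAM's actual interface or data):

```python
from collections import defaultdict

# Hypothetical activity records: (county FIPS, pollutant, activity, emission factor)
activity = [
    ("19153", "NOx", 1200.0, 0.004),   # e.g. vehicle-miles traveled x g/mile
    ("19153", "NOx",  300.0, 0.010),   # nonroad equipment hours x g/hour
    ("19153", "PM",  1200.0, 0.0002),
]

def county_inventory(records):
    """Sum activity x emission factor into a (county, pollutant) -> mass table."""
    totals = defaultdict(float)
    for county, pollutant, amount, factor in records:
        totals[(county, pollutant)] += amount * factor
    return dict(totals)

inv = county_inventory(activity)
# inv[("19153", "NOx")] -> 1200*0.004 + 300*0.010 = 7.8
```

MOVES supplies the emission-factor side of this product for on-road and nonroad sources; the framework's job is generating the activity data and running the aggregation at national scale.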

  13. Role of moving planes and moving spheres following Dupin cyclides

    KAUST Repository

    Jia, Xiaohong

    2014-03-01

    We provide explicit representations of three moving planes that form a μ-basis for a standard Dupin cyclide. We also show how to compute μ-bases for Dupin cyclides in general position and orientation from their implicit equations. In addition, we describe the role of moving planes and moving spheres in bridging between the implicit and rational parametric representations of these cyclides. © 2014 Elsevier B.V.

  15. TRAX - Real-World Tracking of Moving Objects

    DEFF Research Database (Denmark)

    Jensen, Christian Søndergaard; Pakalnis, Stardas

    2007-01-01

accuracy. This paper presents the TRAX tracking system that supports several techniques capable of tracking the current positions of moving objects with guaranteed accuracies at low update and communication costs in real-world settings. The techniques are readily relevant for practical applications, but they also have implications for continued research. The tracking techniques offer a realistic setting for existing query processing techniques that assume that it is possible to always know the exact positions of moving objects. The techniques enable studies of trade-offs between querying and update...

  16. Managing processes and information technology in mergers - the integration of finance processes and systems

    OpenAIRE

    Pedain, Christoph

    2003-01-01

Many companies use mergers to achieve their growth goals or target technology position. To realise synergies that justify the merger transaction, an integration of the merged companies is often necessary. Such integration takes place across company business areas (such as finance or sales) and across the layers of management consideration, which are strategy, human resources, organisation, processes, and information technology. In merger integration techniques, there is a significant gap ...

  17. Integration of distributed plant process computer systems to nuclear power generation facilities

    International Nuclear Information System (INIS)

    Bogard, T.; Finlay, K.

    1996-01-01

Many operating nuclear power generation facilities are replacing their plant process computer. Such replacement projects are driven by equipment obsolescence issues and associated objectives to improve plant operability, increase plant information access, improve man-machine interface characteristics, and reduce operation and maintenance costs. This paper describes a few recently completed and on-going replacement projects with emphasis upon the application of integrated distributed plant process computer systems. By presenting a few recent projects, the variations of distributed systems design show how various configurations can address needs for flexibility, open architecture, and integration of technological advancements in instrumentation and control technology. Architectural considerations for optimal integration of the plant process computer and plant process instrumentation and control are evident from variations of design features

  18. Thermoelectric integrated membrane evaporation water recovery technology

    Science.gov (United States)

    Roebelen, G. J., Jr.; Winkler, H. E.; Dehner, G. F.

    1982-01-01

    The recently developed Thermoelectric Integrated Membrane Evaporation Subsystem (TIMES) offers a highly competitive approach to water recovery from waste fluids for future on-orbit stations such as the Space Operations Center. Low power, compactness and gravity insensitive operation are featured in this vacuum distillation subsystem that combines a hollow fiber membrane evaporator with a thermoelectric heat pump. The hollow fiber elements provide positive liquid/gas phase control with no moving parts other than pumps and an accumulator, thus solving problems inherent in other reclamation subsystem designs. In an extensive test program, over 850 hours of operation were accumulated during which time high quality product water was recovered from both urine and wash water at an average steady state production rate of 2.2 pounds per hour.

  19. Bootstrapping pre-averaged realized volatility under market microstructure noise

    DEFF Research Database (Denmark)

    Hounyo, Ulrich; Goncalves, Sílvia; Meddahi, Nour

The main contribution of this paper is to propose a bootstrap method for inference on integrated volatility based on the pre-averaging approach of Jacod et al. (2009), where the pre-averaging is done over all possible overlapping blocks of consecutive observations. The overlapping nature of the pre-averaged returns implies that these are kn-dependent with kn growing slowly with the sample size n. This motivates the application of a blockwise bootstrap method. We show that the "blocks of blocks" bootstrap method suggested by Politis and Romano (1992) (and further studied by Bühlmann and Künsch (1995)) is valid only when volatility is constant. The failure of the blocks of blocks bootstrap is due to the heterogeneity of the squared pre-averaged returns when volatility is stochastic. To preserve both the dependence and the heterogeneity of squared pre-averaged returns, we propose a novel procedure...
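
The pre-averaging step itself is simple to state: each pre-averaged return is a weighted sum of kn - 1 consecutive raw returns, computed over all overlapping blocks. A stdlib-only Python sketch using the common weight function g(x) = min(x, 1 - x); the bootstrap itself is omitted:

```python
def preaveraged_returns(prices, kn):
    """Pre-averaged returns over all overlapping blocks of kn observations,
    with the common kernel g(x) = min(x, 1 - x)."""
    r = [b - a for a, b in zip(prices, prices[1:])]      # raw returns
    g = [min(j / kn, 1 - j / kn) for j in range(1, kn)]  # weights g(j/kn)
    return [sum(w * r[i + j] for j, w in enumerate(g))
            for i in range(len(r) - len(g) + 1)]

# Noise-free toy path with unit increments: every pre-averaged return is equal,
# and consecutive values share kn - 2 raw returns -- the kn-dependence that
# motivates a blockwise bootstrap.
pre = preaveraged_returns([0, 1, 2, 3, 4, 5], kn=3)
```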

  20. Moving event and moving participant in aspectual conceptions

    Directory of Open Access Journals (Sweden)

    Izutsu Katsunobu

    2016-06-01

Full Text Available This study advances an analysis of the event conception of aspectual forms in four East Asian languages: Ainu, Japanese, Korean, and Ryukyuan. As earlier studies point out, event conceptions can be divided into two major types: the moving-event type and the moving-participant type. All aspectual forms in Ainu and Korean, and most forms in Japanese and Ryukyuan, are based on one of these two types of event conception. Moving-participant oriented Ainu and moving-event oriented Japanese occupy two extremes, between which Korean and Ryukyuan stand. Notwithstanding the geographical relationships among the four languages, Ryukyuan is closer to Ainu than to Korean, whereas Korean is closer to Ainu than to Japanese.

  1. Aging Effect on Audiovisual Integrative Processing in Spatial Discrimination Task

    Directory of Open Access Journals (Sweden)

    Zhi Zou

    2017-11-01

Full Text Available Multisensory integration is an essential process that people employ daily, from conversing in social gatherings to navigating the nearby environment. The aim of this study was to investigate the impact of aging on modulating multisensory integrative processes using event-related potentials (ERPs), and the validity of the study was improved by including "noise" in the contrast conditions. Older and younger participants were involved in perceiving visual and/or auditory stimuli that contained spatial information. The participants responded by indicating the spatial direction (far vs. near and left vs. right) conveyed in the stimuli using different wrist movements. Electroencephalograms (EEGs) were captured in each task trial, along with the accuracy and reaction time of the participants' motor responses. Older participants showed a greater extent of behavioral improvements in the multisensory (as opposed to unisensory) condition compared to their younger counterparts. Older participants were found to have a fronto-centrally distributed super-additive P2, which was not the case for the younger participants. The P2 amplitude difference between the multisensory condition and the sum of the unisensory conditions was found to correlate significantly with performance on spatial discrimination. The results indicated that the age-related effect modulated the integrative process in the perceptual and feedback stages, particularly the evaluation of auditory stimuli. Audiovisual (AV) integration may also serve a functional role during spatial-discrimination processes to compensate for the compromised attention function caused by aging.

  2. On stochastic integration for volatility modulated Brownian-driven Volterra processes via white noise analysis

    DEFF Research Database (Denmark)

    E. Barndorff-Nielsen, Ole; Benth, Fred Espen; Szozda, Benedykt

    This paper generalizes the integration theory for volatility modulated Brownian-driven Volterra processes onto the space G* of Potthoff-Timpel distributions. Sufficient conditions for integrability of generalized processes are given, regularity results and properties of the integral are discussed...

  4. Effects of integrated designs of alarm and process information on diagnosis performance in digital nuclear power plants.

    Science.gov (United States)

    Wu, Xiaojun; She, Manrong; Li, Zhizhong; Song, Fei; Sang, Wei

    2017-12-01

    In the main control rooms of nuclear power plants (NPPs), operators frequently switch between alarm displays and system-information displays to incorporate information from different screens. In this study, we investigated two integrated designs of alarm and process information - integrating alarm information into process displays (denoted as Alarm2Process integration) and integrating process information into alarm displays (denoted as Process2Alarm integration). To analyse the effects of the two integration approaches and time pressure on the diagnosis performance, a laboratory experiment was conducted with ninety-six students. The results show that compared with the non-integrated case, Process2Alarm integration yields better diagnosis performance in terms of diagnosis accuracy, time required to generate correct hypothesis and completion time. In contrast, the Alarm2Process integration leads to higher levels of workload, with no improvement in diagnosis performance. The diagnosis performance of Process2Alarm integration was consistently better than that of Alarm2Process integration, regardless of the levels of time pressure. Practitioner Summary: To facilitate operator's synthesis of NPP information when performing diagnosis tasks, we proposed to integrate process information into alarm displays. The laboratory validation shows that the integration approach significantly improves the diagnosis performance for both low and high time-pressure levels.

  5. Conceptualising the management of packaging within new product development:a grounded investigation in the UK fast moving consumer goods industry

    OpenAIRE

    Simms, Chris; Trott, Paul

    2014-01-01

Purpose: The purpose of this study is to: (i) contribute to existing models of new product development (NPD), and provide new understanding of how a new product's packaging is managed and integrated into the NPD process of fast moving consumer goods firms; and (ii) build on prior research suggesting that firms lack a pipeline of new packaging innovations by uncovering the factors that influence this pipeline issue. Design/methodology/approach: A grounded theory methodology was adopted. Resear...

  6. Attention Modulates the Neural Processes Underlying Multisensory Integration of Emotion

    Directory of Open Access Journals (Sweden)

    Hao Tam Ho

    2011-10-01

Full Text Available Integrating emotional information from multiple sensory modalities is generally assumed to be a pre-attentive process (de Gelder et al., 1999). This assumption, however, presupposes that the integrative process occurs independent of attention. Using event-related potentials (ERPs), the present study investigated whether the neural processes underlying the integration of dynamic facial expression and emotional prosody are indeed unaffected by attentional manipulations. To this end, participants were presented with congruent and incongruent face-voice combinations (e.g., an angry face combined with a neutral voice) and performed different two-choice tasks in four consecutive blocks. Three of the tasks directed the participants' attention to emotion expressions in the face, the voice or both. The fourth task required participants to attend to the synchronicity between voice and lip movements. The results show divergent modulations of early ERP components by the different attentional manipulations. For example, when attention was directed to the face (or the voice), incongruent stimuli elicited a reduced N1 as compared to congruent stimuli. This effect was absent when attention was diverted away from the emotionality in both face and voice, suggesting that the detection of emotional incongruence already requires attention. Based on these findings, we question whether multisensory integration of emotion occurs indeed pre-attentively.

  7. Integration of European Banking and Financial Markets

    OpenAIRE

    Marques Ibanez, David; Molyneux, Philip

    2002-01-01

    This paper investigates banking and capital market developments in Europe and the moves towards the creation of a single financial services market. A critical element in the integration process is the success of the EU's Financial Services Action Plan (FSAP). This seeks to introduce a wide range of legislation aimed at reducing barriers and promoting cross-border trade in financial services - especially for capital markets and retail / SME financial service areas. As was the case in 1992, it ...

  8. Fast generation of video holograms of three-dimensional moving objects using a motion compensation-based novel look-up table.

    Science.gov (United States)

    Kim, Seung-Cheol; Dong, Xiao-Bin; Kwon, Min-Woo; Kim, Eun-Soo

    2013-05-06

A novel approach for fast generation of video holograms of three-dimensional (3-D) moving objects using a motion compensation-based novel-look-up-table (MC-N-LUT) method is proposed. Motion compensation has been widely employed in compression of conventional 2-D video data because of its ability to exploit the high temporal correlation between successive video frames. Here, this concept of motion compensation is applied for the first time to the N-LUT, based on its inherent property of shift-invariance. That is, motion vectors of 3-D moving objects are extracted between two consecutive video frames, and with them the motions of the 3-D objects at each frame are compensated. Through this process, the amount of 3-D object data for which video holograms must be calculated is massively reduced, which results in a dramatic increase in the computational speed of the proposed method. Experimental results with three kinds of 3-D video scenarios reveal that the average number of calculated object points and the average calculation time per object point of the proposed method are reduced to 86.95% and 86.53%, and to 34.99% and 32.30%, respectively, compared to those of the conventional N-LUT and temporal redundancy-based N-LUT (TR-N-LUT) methods.
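
The shift-invariance the method exploits can be shown in a toy 1-D setting: the fringe pattern precomputed for an object point is simply translated by the motion vector rather than recomputed. A Python sketch (values purely illustrative):

```python
def shift_pattern(pattern, dx):
    """Translate a precomputed fringe pattern by dx samples (zero-filling the
    vacated cells) -- the shift-invariance that lets motion compensation reuse
    a previous frame's lookup result instead of recomputing it."""
    n = len(pattern)
    out = [0.0] * n
    for i in range(n):
        if 0 <= i - dx < n:
            out[i] = pattern[i - dx]
    return out

# Frame t: toy 1-D fringe contribution of one object point.
frame_t = [0.0, 0.0, 0.0, 1.0, 0.5, 0.0, 0.0, 0.0]
# Frame t+1: the motion vector says the point moved by +2 samples.
frame_t1 = shift_pattern(frame_t, 2)
```

Only points whose motion vectors cannot be matched need a fresh N-LUT computation, which is where the reported reduction in calculated object points comes from.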

  9. An Enhanced Error Model for EKF-Based Tightly-Coupled Integration of GPS and Land Vehicle's Motion Sensors.

    Science.gov (United States)

    Karamat, Tashfeen B; Atia, Mohamed M; Noureldin, Aboelmagd

    2015-09-22

Reduced inertial sensor systems (RISS) have been introduced by many researchers as a low-cost, low-complexity sensor assembly that can be integrated with GPS to provide a robust integrated navigation system for land vehicles. In earlier works, the developed error models were simplified based on the assumption that the vehicle is mostly moving on a flat horizontal plane. Another limitation is the simplified estimation of the horizontal tilt angles, which is based on simple averaging of the accelerometers' measurements without modelling their errors or tilt angle errors. In this paper, a new error model is developed for RISS that accounts for the effect of tilt angle errors and the accelerometers' errors. Additionally, it also includes important terms in the system dynamic error model which were ignored during the linearization process in earlier works. An augmented extended Kalman filter (EKF) is designed to incorporate tilt angle errors and transversal accelerometer errors. The new error model and the augmented EKF design are developed in a tightly-coupled RISS/GPS integrated navigation system. The proposed system was tested on real trajectory data under degraded GPS environments, and the results were compared to earlier works on RISS/GPS systems. The findings demonstrate that the proposed enhanced system introduces significant improvements in navigational performance.
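
Two of the ingredients above can be sketched in isolation: the accelerometer-based tilt estimate that earlier works relied on, and a Kalman measurement update of the kind the augmented EKF performs (reduced here to a single scalar state; all numbers are illustrative, not the paper's model):

```python
import math

G = 9.80665  # standard gravity, m/s^2

def pitch_from_accel(f_forward):
    """Static tilt estimate: forward specific force ~ g*sin(pitch) when the
    vehicle is not accelerating -- the simplification the paper improves on."""
    return math.asin(max(-1.0, min(1.0, f_forward / G)))

def kalman_update(x, P, z, R):
    """Scalar Kalman measurement update with unit measurement matrix (H = 1)."""
    K = P / (P + R)                      # Kalman gain
    return x + K * (z - x), (1 - K) * P  # corrected state and covariance

pitch = pitch_from_accel(0.85)               # ~0.087 rad of road grade
x, P = kalman_update(pitch, 0.05, 0.10, 0.05)
# equal prediction/measurement variances -> gain 0.5, estimate midway between
```

The paper's contribution is precisely to stop treating `pitch_from_accel` as error-free and instead carry tilt-angle and accelerometer error states through the full EKF.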

  10. Process modeling for the Integrated Nonthermal Treatment System (INTS) study

    Energy Technology Data Exchange (ETDEWEB)

    Brown, B.W.

    1997-04-01

This report describes the process modeling done in support of the Integrated Nonthermal Treatment System (INTS) study. This study was performed to supplement the Integrated Thermal Treatment System (ITTS) study and comprises five conceptual treatment systems that treat DOE contact-handled mixed low-level wastes (MLLW) at temperatures of less than 350°F. ASPEN PLUS, a chemical process simulator, was used to model the systems. Nonthermal treatment systems were developed as part of the INTS study and include sufficient processing steps to treat the entire inventory of MLLW. The final result of the modeling is a process flowsheet with a detailed mass and energy balance. In contrast to the ITTS study, which modeled only the main treatment system, the INTS study modeled each of the various processing steps with ASPEN PLUS, release 9.1-1. Trace constituents, such as radionuclides and minor pollutant species, were not included in the calculations.

  11. Journal of Earth System Science | Indian Academy of Sciences

    Indian Academy of Sciences (India)

To overcome this shortcoming, monthly inflow is predicted in this study based on a combination of seasonal autoregressive integrated moving average (SARIMA) and gene expression programming (GEP) models, which is a new hybrid method (SARIMA–GEP). To this end, a four-step process is employed. First, the monthly ...
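
The SARIMA half of such a hybrid rests on seasonal differencing followed by an autoregressive fit. A stdlib-only Python sketch with a single AR(1) term on seasonally differenced data (an illustrative toy, not the paper's actual SARIMA–GEP model):

```python
def seasonal_difference(y, s=12):
    """Remove the seasonal component: d[t] = y[t] - y[t-s]."""
    return [y[t] - y[t - s] for t in range(s, len(y))]

def fit_ar1(d):
    """Least-squares AR(1) coefficient phi minimizing sum (d[t] - phi*d[t-1])^2."""
    num = sum(d[t] * d[t - 1] for t in range(1, len(d)))
    den = sum(d[t - 1] ** 2 for t in range(1, len(d)))
    return num / den

def forecast_next(y, phi, s=12):
    """One-step forecast: predict the next seasonal difference, then undo it."""
    d_last = y[-1] - y[-1 - s]
    return y[len(y) - s] + phi * d_last

# Toy monthly series: a yearly spike in month 5 plus a slowly growing level.
y = [10 + 5 * (m % 12 == 5) + 0.1 * m for m in range(48)]
phi = fit_ar1(seasonal_difference(y))
yhat = forecast_next(y, phi)  # forecast for month 48
```

The hybrid method replaces this linear AR stage's residual modelling with GEP to capture nonlinear structure the linear model misses.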

  12. The concept of the average stress in the fracture process zone for the search of the crack path

    Directory of Open Access Journals (Sweden)

    Yu.G. Matvienko

    2015-10-01

Full Text Available The concept of the average stress has been employed to propose the maximum average tangential stress (MATS) criterion for predicting the direction of the fracture angle. This criterion states that a crack grows when the maximum average tangential stress in the fracture process zone ahead of the crack tip reaches its critical value, and that the crack growth direction coincides with the direction of the maximum average tangential stress along a constant radius around the crack tip. The tangential stress is described by the singular and nonsingular (T-stress) terms in the Williams series solution. To demonstrate the validity of the proposed MATS criterion, the criterion is directly applied to experiments reported in the literature for the mixed mode I/II crack growth behavior of Guiting limestone. The predicted directions of the fracture angle are consistent with the experimental data. The concept of the average stress has also been employed to predict the surface crack path under rolling-sliding contact loading. The proposed model considers the size and orientation of the initial crack, normal and tangential loading due to rolling-sliding contact, as well as the influence of fluid trapped inside the crack by a hydraulic pressure mechanism. The MATS criterion is directly applied to an equivalent contact model for surface crack growth on a gear tooth flank.
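
The criterion can be made concrete with the first two terms of the Williams expansion, σθθ = cos(θ/2)·[KI·cos²(θ/2) − 1.5·KII·sinθ]/√(2πr) + T·sin²θ, averaged over the process zone 0 < r ≤ d (the radial average of 1/√(2πr) is 2/√(2πd)). A Python sketch that grid-searches the fracture angle; parameter values are illustrative, not from the paper:

```python
import math

def avg_tangential_stress(theta, KI, KII, T, d):
    """Tangential stress averaged over the process zone 0 < r <= d:
    the singular term picks up the radial average 2/sqrt(2*pi*d)."""
    singular = (2.0 / math.sqrt(2.0 * math.pi * d)) * math.cos(theta / 2) * (
        KI * math.cos(theta / 2) ** 2 - 1.5 * KII * math.sin(theta)
    )
    return singular + T * math.sin(theta) ** 2

def fracture_angle(KI, KII, T, d, n=2001):
    """Grid search for the angle maximizing the average tangential stress."""
    thetas = [-math.pi + 2 * math.pi * i / (n - 1) for i in range(n)]
    return max(thetas, key=lambda th: avg_tangential_stress(th, KI, KII, T, d))

# Pure mode I with negligible T-stress: the crack should grow straight ahead.
angle = fracture_angle(KI=1.0, KII=0.0, T=0.0, d=1e-3)
# angle ~ 0; pure mode II recovers the classical kink of about -70.5 degrees
```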

  13. Working memory moderates the effect of the integrative process of implicit and explicit autonomous motivation on academic achievement.

    Science.gov (United States)

    Gareau, Alexandre; Gaudreau, Patrick

    2017-11-01

In previous research, autonomous motivation (AM) has been found to be associated with school achievement, but the relation has been largely heterogeneous across studies. AM has typically been assessed with explicit measures such as self-report questionnaires. Recent self-determination theory (SDT) research has suggested that converging implicit and explicit measures can be taken to characterize the integrative process in SDT. Drawing from dual-process theories, we contended that explicit AM is likely to promote school achievement when it is part of an integrated cognitive system that combines easily accessible mental representations (i.e., implicit AM) and efficient executive functioning. A sample of 272 university students completed a questionnaire and a lexical decision task to assess their explicit and implicit AM, respectively, and they also completed working memory capacity measures. Grades were obtained at the end of the semester to examine the short-term prospective effect of implicit and explicit AM, working memory, and their interaction. Results of moderation analyses provided support for a synergistic interaction in which the association between explicit AM and academic achievement was positive and significant only for individuals with a high level of implicit AM. Moreover, working memory moderated the synergistic effect of explicit and implicit AM. Explicit AM was positively associated with academic achievement for students with average-to-high levels of working memory capacity, but only if their motivation operated synergistically with high implicit AM. The integrative process thus seems to hold better properties for achievement than the sole effect of explicit AM. Implications for SDT are outlined. © 2017 The British Psychological Society.
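
The reported synergistic interaction amounts to the simple slope of explicit AM on achievement differing across levels of implicit AM. A stdlib-only Python sketch of that simple-slopes logic (the data are fabricated to mimic the reported pattern, not the study's data):

```python
def slope(x, y):
    """Ordinary least-squares slope of y on x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    return (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
            / sum((xi - mx) ** 2 for xi in x))

# Toy data: explicit AM predicts grades only when implicit AM is high.
explicit = [1, 2, 3, 4, 5, 1, 2, 3, 4, 5]
implicit = ["high"] * 5 + ["low"] * 5
grades   = [60, 65, 70, 75, 80, 70, 69, 71, 70, 70]

def simple_slope(level):
    """Slope of grades on explicit AM within one implicit-AM group."""
    xs = [e for e, i in zip(explicit, implicit) if i == level]
    ys = [g for g, i in zip(grades, implicit) if i == level]
    return slope(xs, ys)

# simple_slope("high") -> 5.0; simple_slope("low") -> ~0.1: the interaction
```

In the paper this is done with a continuous moderated regression (and a further working-memory moderator) rather than a median split; the grouping here is only for illustration.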

  14. Integrated management of information inside maintenance processes. From the building registry to BIM systems

    Directory of Open Access Journals (Sweden)

    Cinzia Talamo

    2014-10-01

Full Text Available The paper presents objectives, methods and results of two research projects dealing with the improvement of integrated information management within maintenance processes. Focusing on information needs regarding the last phases of the building process, the two projects draft approaches characterizing a path of progressive improvement of strategies for integration: from a building registry, unique for the whole construction process, to an integrated management of the building process with the support of BIM systems.

  15. Test processing integrated system (S.I.D.E.X.)

    International Nuclear Information System (INIS)

    Sabas, M.; Oules, H.; Badel, D.

    1969-01-01

The Test Processing Integrated System is mostly composed of a CAE 9080 (equivalent to the S.D.S. 9300) computer equipped with a 100,000 samples/sec acquisition system. The system is designed for high-speed data acquisition and data processing in environmental tests, as well as for the calculation of structural models. Such a digital approach to data processing has many advantages compared to conventional methods based on analog instruments. (author) [fr

  16. Hybrid Heat Capacity - Moving Slab Laser Concept

    International Nuclear Information System (INIS)

    Stappaerts, E A

    2002-01-01

A hybrid configuration of a heat capacity laser (HCL) and a moving slab laser (MSL) has been studied. Multiple volumes of solid-state laser material are sequentially diode-pumped and their energy extracted. When a volume reaches a maximum temperature after a "sub-magazine depth", it is moved out of the pumping region into a cooling region, and a new volume is introduced. The total magazine depth equals the sub-magazine depth times the number of volumes. The design parameters are chosen to provide high duty factor operation, resulting in effective use of the diode arrays. The concept significantly reduces diode array cost over conventional heat capacity lasers, and it is considered enabling for many potential applications. A conceptual design study of the hybrid configuration has been carried out. Three concepts were evaluated using CAD tools. The concepts are described and their relative merits discussed. Because of reduced disk size and diode cost, the hybrid concept may allow scaling to average powers on the order of 0.5 MW/module

  17. EUROPEAN INTEGRATION: A MULTILEVEL PROCESS THAT REQUIRES A MULTILEVEL STATISTICAL ANALYSIS

    Directory of Open Access Journals (Sweden)

    Roxana-Otilia-Sonia HRITCU

    2015-11-01

Full Text Available As a process of market regulation and a system of multi-level governance with several supranational, national and subnational levels of decision making, European integration is a genuinely multilevel phenomenon. The individual characteristics of citizens, as well as the environment where the integration process takes place, are important. To understand European integration and its consequences it is important to develop and test multi-level theories that consider individual-level characteristics as well as the overall context where individuals act and express their characteristics. A central argument of this paper is that support for European integration is influenced by factors operating at different levels. We review and present theories and related research on the use of multilevel analysis in the European area. This paper draws insights on various aspects and consequences of European integration to take stock of what we know about how and why to use multilevel modeling.

  18. A Moving-Object Index for Efficient Query Processing with PeerWise Location Privacy

    DEFF Research Database (Denmark)

    Lin, Dan; Jensen, Christian S.; Zhang, Rui

    2011-01-01

    attention has been paid to enabling so-called peer-wise privacy: the protection of a user's location from unauthorized peer users. This paper identifies an important efficiency problem in existing peer-privacy approaches that simply apply a filtering step to identify users that are located in a query range, but that do not want to disclose their location to the querying peer. To solve this problem, we propose a novel, privacy-policy enabled index called the PEB-tree that seamlessly integrates location proximity and policy compatibility. We propose efficient algorithms that use the PEB-tree for processing privacy-aware range and kNN queries. Extensive experiments suggest that the PEB-tree enables efficient query processing.
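The query semantics described in the abstract above can be sketched as follows. Note this is only an illustration of the peer-wise privacy predicate (a user is returned only if it is inside the range *and* its policy admits the querier); the actual PEB-tree integrates this check into an index structure rather than a linear scan, and the `User` class and policy representation here are hypothetical:

```python
# Hypothetical sketch of a privacy-aware range query: return only users
# who are (a) inside the query rectangle and (b) willing to disclose
# their location to the querying peer. A linear scan stands in for the
# PEB-tree, whose internals are not described in this record.
from dataclasses import dataclass, field

@dataclass
class User:
    uid: str
    x: float
    y: float
    allowed: set = field(default_factory=set)  # peers allowed to see location

def privacy_aware_range(users, querier, xlo, xhi, ylo, yhi):
    """Users in [xlo, xhi] x [ylo, yhi] whose policy admits `querier`."""
    return [u.uid for u in users
            if xlo <= u.x <= xhi and ylo <= u.y <= yhi
            and querier in u.allowed]

users = [
    User("alice", 1, 1, {"bob"}),
    User("carol", 2, 2, set()),    # in range, but discloses to no one
    User("dave",  9, 9, {"bob"}),  # would disclose, but outside the range
]
print(privacy_aware_range(users, "bob", 0, 5, 0, 5))  # -> ['alice']
```

The efficiency problem the paper targets is visible even here: a filter-then-check approach touches every user in the range regardless of policy, whereas an index that integrates policy compatibility can prune incompatible users early.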

  19. The Conceptual Framework for Ensuring Economic Safety of Corporate Integration Processes

    Directory of Open Access Journals (Sweden)

    Gutsaliuk Oleksii M.

    2016-08-01

    Full Text Available The objectively growing number and influence of negative environmental threats makes ensuring the economic safety of national economic entities an urgent issue. The article notes that, while counteracting threats, enterprises also work toward development, one form of which is the establishment of corporate structures and the implementation of integration processes. It is proposed to achieve the desired level of economic safety of a corporate structure by optimizing the correlation of resources and competencies, skills and technologies for their use within the integrated logistics value chain. Here, the implementation of the integration process itself serves as an instrument for achieving this optimal correlation, and the level of economic safety is considered one of the optimization criteria. A system of the authors' hypotheses is taken as the basis for ensuring the economic safety of the corporate integration process. Each hypothesis corresponds to a set of conceptual principles aimed at the practical implementation of the proposed approaches. Within these principles, the relationship between the incentives and benefits of integration and the basis for ensuring their safety is presented; the differences between safety of functioning and safety of development are examined; the use of logistics methodology to harmonize the interests of participants in the corporate structure is justified; and the relevance of applying the resource approach to managing integration and development safety is demonstrated. A graphical representation of the causal relationships between the proposed conceptual principles allowed formalizing the subject area of studying corporate integration safety.

  20. Striking the right chord: moving music increases psychological transportation and behavioral intentions.

    Science.gov (United States)

    Strick, Madelijn; de Bruin, Hanka L; de Ruiter, Linde C; Jonkers, Wouter

    2015-03-01

    Three experiments among university students (N = 372) investigated the persuasive power of moving (i.e., intensely emotional and "chills"-evoking) music in audio-visual advertising. Although advertisers typically aim to increase elaborate processing of the message, these studies illustrate that the persuasive effect of moving music is based on increased narrative transportation ("getting lost" in the ad's story), which reduces critical processing. In Experiment 1, moving music increased transportation and some behavioral intentions (e.g., to donate money). Experiment 2 experimentally increased the salience of the advertiser's manipulative intent and showed that moving music reduces inferences of manipulative intent, leading in turn to increased behavioral intentions. Experiment 3 tested boundary conditions, showing that moving music fails to increase behavioral intentions when the salience of manipulative intent is either extremely high (which precludes transportation) or extremely low (which precludes a reduction of inferences of manipulative intent). Moving music did not increase memory performance, beliefs, or explicit attitudes, suggesting that the influence is affect-based rather than cognition-based. Together, these studies illustrate that moving music reduces inferences of manipulation and increases behavioral intentions by transporting viewers into the story of the ad. PsycINFO Database Record (c) 2015 APA, all rights reserved.