DEFF Research Database (Denmark)
Silvennoinen, Annastiina; Teräsvirta, Timo
This article contains a review of multivariate GARCH models. The most common GARCH models are presented and their properties considered; this also includes nonparametric and semiparametric models. Existing specification and misspecification tests are discussed. Finally, an empirical example is given … in which several multivariate GARCH models are fitted to the same data set and the results are compared.
Multivariate Modelling via Matrix Subordination
DEFF Research Database (Denmark)
Nicolato, Elisa
… stochastic volatility via time-change is quite ineffective when applied to the multivariate setting. In this work we propose a new class of models, obtained by conditioning a multivariate Brownian motion on a so-called matrix subordinator. The resulting model class encompasses the vast majority …
Multivariate covariance generalized linear models
DEFF Research Database (Denmark)
Bonat, W. H.; Jørgensen, Bent
2016-01-01
We propose a general framework for non-normal multivariate data analysis called multivariate covariance generalized linear models, designed to handle multivariate response variables along with a wide range of temporal and spatial correlation structures, defined in terms of a covariance link function combined with a matrix linear predictor involving known matrices. The method is motivated by three data examples that are not easily handled by existing methods. The first example concerns multivariate count data; the second involves response variables of mixed types, combined with repeated … Models are fitted using an efficient Newton scoring algorithm based on quasi-likelihood and Pearson estimating functions, using only second-moment assumptions. This provides a unified approach to a wide variety of response-variable types and covariance structures, including multivariate extensions …
Multivariate Modelling via Matrix Subordination
DEFF Research Database (Denmark)
Nicolato, Elisa
Extending the vast library of univariate models to price multi-asset derivatives is still a challenge in the field of Quantitative Finance. Within the literature on multivariate modelling, a dichotomy may be noticed. On one hand, the focus has been on the construction of models displaying stochastic correlation within the framework of diffusion processes (see e.g. Pigorsch and Stelzer (2008), Hubalek and Nicolato (2008) and Zhu (2000)). On the other hand, a number of authors have proposed multivariate Lévy models, which allow for flexible modelling of returns, but at the expense of a constant correlation structure (see e.g. Leoni and Schoutens (2007) among others). Constructing tractable multivariate models that display flexible, stochastic correlation structures combined with jumps is proving rather problematic. In particular, the classical technique of introducing …
Preliminary Multivariable Cost Model for Space Telescopes
Stahl, H. Philip
2010-01-01
Parametric cost models are routinely used to plan missions, compare concepts and justify technology investments. Previously, the authors published two single-variable cost models based on 19 flight missions. The current paper presents the development of a multivariable space telescope cost model. The validity of previously published models is tested. Cost-estimating relationships that are and are not significant cost drivers are identified, and interrelationships between variables are explored.
Sparse Linear Identifiable Multivariate Modeling
DEFF Research Database (Denmark)
Henao, Ricardo; Winther, Ole
2011-01-01
In this paper we consider sparse and identifiable linear latent variable (factor) and linear Bayesian network models for parsimonious analysis of multivariate data. We propose a computationally efficient method for joint parameter and model inference, and model comparison. It consists of a fully Bayesian hierarchy for sparse models using slab and spike priors (two-component δ-function and continuous mixtures), non-Gaussian latent factors and a stochastic search over the ordering of the variables. The framework, which we call SLIM (Sparse Linear Identifiable Multivariate modeling), is validated and benchmarked on artificial and real biological data sets. SLIM is closest in spirit to LiNGAM (Shimizu et al., 2006), but differs substantially in inference, Bayesian network structure learning and model comparison. Experimentally, SLIM performs equally well or better than LiNGAM with comparable …
Multivariable modeling and multivariate analysis for the behavioral sciences
Everitt, Brian S
2009-01-01
Multivariable Modeling and Multivariate Analysis for the Behavioral Sciences shows students how to apply statistical methods to behavioral science data in a sensible manner. Assuming some familiarity with introductory statistics, the book analyzes a host of real-world data to provide useful answers to real-life issues.The author begins by exploring the types and design of behavioral studies. He also explains how models are used in the analysis of data. After describing graphical methods, such as scatterplot matrices, the text covers simple linear regression, locally weighted regression, multip
Model Checking Multivariate State Rewards
DEFF Research Database (Denmark)
Nielsen, Bo Friis; Nielson, Flemming; Nielson, Hanne Riis
2010-01-01
We consider continuous stochastic logics with state rewards that are interpreted over continuous-time Markov chains. We show how results from multivariate phase-type distributions can be used to obtain higher-order moments for multivariate state rewards (including covariance). We also generalise the treatment of eventuality to unbounded path formulae. For all extensions we show how to obtain closed-form definitions that are straightforward to implement, and we illustrate our development on a small example.
Meléndez, E; Ortiz, M C; Sarabia, L A; Íñiguez, M; Puras, P
2013-01-25
The ripeness of grapes at harvest time is one of the most important parameters for obtaining high-quality red wines. Traditionally, the decision to harvest is taken only after analysing the sugar concentration, titratable acidity and pH of the grape juice (technological maturity). However, these parameters only provide information about the ripeness of the pulp and overlook the real degree of maturity of the skins and seeds (phenolic maturity). The two maturities, technological and phenolic, are not reached simultaneously; on the contrary, they tend to diverge depending on several factors: grape variety, cultivar, adverse weather conditions, soil, water availability and cultural practices. This divergence is increasing as a consequence of climate change (larger quantities of CO2, less rain and higher temperatures). A total of 247 samples collected from vineyards representative of the qualified designation of origin Rioja from 2007 to 2011 have been analysed. The samples contain the four grape varieties usual in the elaboration of Rioja wines ('tempranillo', 'garnacha', 'mazuelo' and 'graciano'). The present study is the first systematic investigation of grape maturity that includes the organoleptic evaluation of the degree of maturity (sugars/acidity maturity, aromatic maturity of the pulp, aromatic maturity of the skins and tannin maturity) together with the values of the physicochemical parameters (probable alcohol degree, total acidity, pH, malic acid, K, total polyphenol index, anthocyanins, absorbances at 420, 520 and 620 nm, colour index and tartaric acid) determined on the same samples. A varimax rotation of the latent variables of a PLS model between the physicochemical variables and the mean of the four sensory variables allows both maturities to be identified. Moreover, the position of the samples in the first plane defines the effect that the different factors exert on both phenolic and technological maturities.
Multivariate pluvial flood damage models
Energy Technology Data Exchange (ETDEWEB)
Van Ootegem, Luc [HIVA — University of Louvain (Belgium); SHERPPA — Ghent University (Belgium); Verhofstadt, Elsy [SHERPPA — Ghent University (Belgium); Van Herck, Kristine; Creten, Tom [HIVA — University of Louvain (Belgium)
2015-09-15
Depth–damage functions, relating the monetary flood damage to the depth of the inundation, are commonly used in the case of fluvial floods (floods caused by a river overflowing). We construct four multivariate damage models for pluvial floods (caused by extreme rainfall) by differentiating on the one hand between ground floor floods and basement floods and on the other hand between damage to residential buildings and damage to housing contents. We not only take into account the effect of flood depth on damage, but also incorporate the effects of non-hazard indicators (building characteristics, behavioural indicators and socio-economic variables). By using a Tobit estimation technique on identified victims of pluvial floods in Flanders (Belgium), we take into account cases of reported zero damage. Our results show that flood depth is an important predictor of damage, but with a diverging impact between ground floor floods and basement floods. Non-hazard indicators are also important. For example, being aware of the risk just before the water enters the building reduces content damage considerably, underlining the importance of warning systems and policy in the case of pluvial floods. - Highlights: • Prediction of pluvial flood damage using also non-hazard information • We include 'no damage' cases using a Tobit model. • The effect of flood depth on damage is stronger for ground floor than for basement floods. • Non-hazard indicators are especially important for content damage. • Potential gain of policies that increase awareness of flood risks.
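A minimal sketch of the Tobit idea used in the record above, under assumed synthetic data: latent damage is linear in flood depth, but observed damage is censored at zero, so zero-damage reports contribute the censoring probability to the likelihood. The variable names and the single depth covariate are illustrative, not taken from the paper.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(0)

# Synthetic "flood damage" data: latent damage linear in depth, censored at 0
n = 2000
depth = rng.uniform(0.0, 2.0, n)
X = np.column_stack([np.ones(n), depth])
beta_true = np.array([-0.5, 1.5])
sigma_true = 1.0
y_latent = X @ beta_true + rng.normal(0.0, sigma_true, n)
y = np.maximum(y_latent, 0.0)          # observed damage (zeros are kept)

def tobit_negloglik(theta, X, y):
    beta, log_sigma = theta[:-1], theta[-1]
    sigma = np.exp(log_sigma)
    mu = X @ beta
    pos = y > 0
    # Uncensored observations: normal density
    ll = np.sum(norm.logpdf((y[pos] - mu[pos]) / sigma) - np.log(sigma))
    # Censored observations: probability mass at the censoring point
    ll += np.sum(norm.logcdf(-mu[~pos] / sigma))
    return -ll

theta0 = np.zeros(X.shape[1] + 1)
res = minimize(tobit_negloglik, theta0, args=(X, y), method="BFGS")
beta_hat, sigma_hat = res.x[:-1], np.exp(res.x[-1])
```

Dropping the zero-damage cases (or treating them as ordinary zeros in least squares) would bias the depth coefficient, which is exactly what the Tobit correction avoids.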
Building multivariate systems biology models
Kirwan, G.M.; Johansson, E.; Kleemann, R.; Verheij, E.R.; Wheelock, A.M.; Goto, S.; Trygg, J.; Wheelock, C.E.
2012-01-01
Systems biology methods using large-scale "omics" data sets face unique challenges: integrating and analyzing a near-limitless data space while recognizing and removing systematic variation or noise. Herein we propose a complementary multivariate analysis workflow to both integrate "omics" data from …
Multivariate generalized linear mixed models using R
Berridge, Damon Mark
2011-01-01
Multivariate Generalized Linear Mixed Models Using R presents robust and methodologically sound models for analyzing large and complex data sets, enabling readers to answer increasingly complex research questions. The book applies the principles of modeling to longitudinal data from panel and related studies via the Sabre software package in R. A Unified Framework for a Broad Class of Models The authors first discuss members of the family of generalized linear models, gradually adding complexity to the modeling framework by incorporating random effects. After reviewing the generalized linear model notation, they illustrate a range of random effects models, including three-level, multivariate, endpoint, event history, and state dependence models. They estimate the multivariate generalized linear mixed models (MGLMMs) using either standard or adaptive Gaussian quadrature. The authors also compare two-level fixed and random effects linear models. The appendices contain additional information on quadrature, model...
The value of multivariate model sophistication
DEFF Research Database (Denmark)
Rombouts, Jeroen; Stentoft, Lars; Violante, Francesco
2014-01-01
We assess the predictive accuracies of a large number of multivariate volatility models in terms of pricing options on the Dow Jones Industrial Average. We measure the value of model sophistication in terms of dollar losses by considering a set of 444 multivariate models that differ in their specification … In addition to investigating the value of model sophistication in terms of dollar losses directly, we also use the model confidence set approach to statistically infer the set of models that delivers the best pricing performance.
Multivariate Model for Test Response Analysis
Krishnan, Shaji; Kerkhoff, Hans G.
2010-01-01
A systematic approach to construct an effective multivariate test response model for capturing manufacturing defects in electronic products is described. The effectiveness of the model is demonstrated by its capability in reducing the number of test-points, while achieving the maximal coverage
Modeling Multivariate Volatility Processes: Theory and Evidence
Directory of Open Access Journals (Sweden)
Jelena Z. Minovic
2009-05-01
This article presents a theoretical and empirical methodology for the estimation and modeling of multivariate volatility processes. It surveys model specifications and estimation methods. The multivariate GARCH models covered are VEC (initially due to Bollerslev, Engle and Wooldridge, 1988), diagonal VEC (DVEC), BEKK (named after Baba, Engle, Kraft and Kroner, 1995), the Constant Conditional Correlation model (CCC; Bollerslev, 1990) and the Dynamic Conditional Correlation (DCC) models of Tse and Tsui (2002) and Engle (2002). I illustrate the approach by applying it to daily data from the Belgrade stock exchange: I examine two pairs of daily log returns for stocks and an index, report the results obtained, and compare them with the restricted versions of the BEKK, DVEC and CCC representations. The estimation methods used are maximum log-likelihood (in the BEKK and DVEC models) and a two-step approach (in the CCC model).
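As an illustration of the CCC idea surveyed above, the sketch below fits univariate GARCH(1,1) models by quasi-maximum likelihood and then estimates the constant correlation from the standardized residuals. The data are simulated, not the Belgrade stock exchange series, and the simple Nelder-Mead optimizer is an assumed implementation choice.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)

def garch11_filter(params, r):
    """Conditional variance recursion h[t] = omega + alpha*r[t-1]^2 + beta*h[t-1]."""
    omega, alpha, beta = params
    h = np.empty_like(r)
    h[0] = np.var(r)
    for t in range(1, len(r)):
        h[t] = omega + alpha * r[t - 1] ** 2 + beta * h[t - 1]
    return h

def garch11_negloglik(params, r):
    omega, alpha, beta = params
    if omega <= 0 or alpha < 0 or beta < 0 or alpha + beta >= 1:
        return 1e10                      # penalize invalid parameters
    h = garch11_filter(params, r)
    return 0.5 * np.sum(np.log(h) + r ** 2 / h)

def fit_garch11(r):
    res = minimize(garch11_negloglik, x0=[0.05 * np.var(r), 0.05, 0.9],
                   args=(r,), method="Nelder-Mead")
    return res.x

# Simulate two GARCH(1,1) series with constant correlation 0.6
T = 3000
chol = np.linalg.cholesky(np.array([[1.0, 0.6], [0.6, 1.0]]))
z = rng.standard_normal((T, 2)) @ chol.T
r = np.empty((T, 2))
h = np.full(2, 0.1)
for t in range(T):
    r[t] = np.sqrt(h) * z[t]
    h = 0.01 + 0.05 * r[t] ** 2 + 0.9 * h

# CCC step: univariate fits, then correlation of standardized residuals
params = [fit_garch11(r[:, i]) for i in range(2)]
eps = np.column_stack([r[:, i] / np.sqrt(garch11_filter(params[i], r[:, i]))
                       for i in range(2)])
R_hat = np.corrcoef(eps.T)               # estimated constant correlation matrix
```

This two-step structure (univariate volatilities first, correlation second) is what makes CCC far cheaper to estimate than full VEC or BEKK parameterizations.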
A Multivariate Approach to Functional Neuro Modeling
DEFF Research Database (Denmark)
Mørch, Niels J.S.
1998-01-01
This Ph.D. thesis, A Multivariate Approach to Functional Neuro Modeling, deals with the analysis and modeling of data from functional neuroimaging experiments. A multivariate dataset description is provided which facilitates efficient representation of typical datasets and, more importantly … and the overall conditions governing the functional experiment, via associated micro- and macroscopic variables. The description facilitates an efficient microscopic re-representation, as well as a handle on the link between brain and behavior; the latter is achieved by hypothesizing variations in the micro … a generalization-theoretical framework centered around measures of model generalization error. - Only a few, if any, examples of the application of generalization theory to functional neuro modeling currently exist in the literature. - Exemplification of the proposed generalization-theoretical framework …
Shared Frailty Model for Left-Truncated Multivariate Survival Data
DEFF Research Database (Denmark)
Jensen, Henrik; Brookmeyer, Ron; Aaby, Peter
Keywords: multivariate survival data, left truncation, multiplicative hazard model, shared gamma frailty, conditional model, piecewise exponential model, childhood survival
Multivariate Generalized Linear Mixed Models Using R
Berridge, Damon M
2011-01-01
To provide researchers with the ability to analyze large and complex data sets using robust models, this book presents a unified framework for a broad class of models that can be applied using a dedicated R package (Sabre). The first five chapters cover the analysis of multilevel models using univariate generalized linear mixed models (GLMMs). The next few chapters extend to multivariate GLMMs and the last chapters address more specialized topics, such as parallel computing for large-scale analyses. Each chapter includes many real-world examples implemented using Sabre as well as exercises and
Modelling and Forecasting Multivariate Realized Volatility
DEFF Research Database (Denmark)
Chiriac, Roxana; Voev, Valeri
This paper proposes a methodology for modelling time series of realized covariance matrices in order to forecast multivariate risks. The approach allows for flexible dynamic dependence patterns and guarantees positive definiteness of the resulting forecasts without imposing parameter restrictions....... We provide an empirical application of the model, in which we show by means of stochastic dominance tests that the returns from an optimal portfolio based on the model's forecasts second-order dominate returns of portfolios optimized on the basis of traditional MGARCH models. This result implies...
"Ranking Multivariate GARCH Models by Problem Dimension"
Caporin, Massimiliano; McAleer, Michael
2010-01-01
In the last 15 years, several Multivariate GARCH (MGARCH) models have appeared in the literature. The two most widely known and used are the Scalar BEKK model of Engle and Kroner (1995) and Ding and Engle (2001), and the DCC model of Engle (2002). Some recent research has begun to examine MGARCH specifications in terms of their out-of-sample forecasting performance. In this paper, we provide an empirical comparison of a set of MGARCH models, namely BEKK, DCC, Corrected DCC (cDCC) …
Bayesian Inference of a Multivariate Regression Model
Directory of Open Access Journals (Sweden)
Marick S. Sinay
2014-01-01
We explore Bayesian inference of a multivariate linear regression model using a flexible prior for the covariance structure. The commonly adopted Bayesian setup involves the conjugate prior: a multivariate normal distribution for the regression coefficients and an inverse Wishart specification for the covariance matrix. Here we depart from this approach and propose a novel Bayesian estimator for the covariance. A multivariate normal prior for the unique elements of the matrix logarithm of the covariance matrix is considered. Such a structure allows for a richer class of prior distributions for the covariance, with respect to the strength of beliefs in prior location hyperparameters, as well as the added ability to model potential correlation among elements of the covariance structure. The posterior moments of all relevant parameters of interest are calculated via a Markov chain Monte Carlo procedure. A Metropolis-Hastings-within-Gibbs algorithm is invoked, with a proposal density constructed to closely match the shape of the target posterior distribution. As an application of the proposed technique, we investigate a multiple regression based upon the 1980 High School and Beyond Survey.
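The matrix-logarithm parametrization described above can be sketched as follows: an unconstrained real vector (for instance a multivariate normal prior draw) fills a symmetric matrix, and its matrix exponential is automatically symmetric positive definite. This is a toy illustration of the prior's support, not the authors' MCMC sampler.

```python
import numpy as np

def expm_sym(A):
    """Matrix exponential of a symmetric matrix via eigendecomposition."""
    w, V = np.linalg.eigh(A)
    return (V * np.exp(w)) @ V.T

def vec_to_cov(theta, d):
    """Map d*(d+1)/2 unconstrained reals to an SPD covariance matrix."""
    A = np.zeros((d, d))
    iu = np.triu_indices(d)
    A[iu] = theta
    A = A + A.T - np.diag(np.diag(A))   # symmetric matrix logarithm of Sigma
    return expm_sym(A)

d = 3
rng = np.random.default_rng(2)
# An (assumed) multivariate normal prior draw on the log-covariance elements
theta = rng.normal(0.0, 0.5, d * (d + 1) // 2)
Sigma = vec_to_cov(theta, d)
```

Because the exponential map is a bijection onto SPD matrices, any normal prior on `theta` induces a valid prior on the covariance with no positivity constraints to enforce during sampling.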
Multivariate Markov chain modeling for stock markets
Maskawa, Jun-ichi
2003-06-01
We study a multivariate Markov chain model as a stochastic model of the price changes of portfolios, in the framework of the mean-field approximation. The time series of price changes are coded into sequences of up and down spins according to their signs. We start with the discussion of small portfolios consisting of two stock issues. The generalization of our model to a portfolio of arbitrary size is constructed by a recurrence relation. The resulting form of the joint probability of the stationary state coincides with the Gibbs measure assigned to each configuration of a spin glass model. Through the analysis of actual portfolios, it is shown that the synchronization of the direction of price changes is well described by the model.
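The up/down spin coding described above is easy to reproduce for the two-stock case: code the signs of the two return series as spins, collapse each spin pair into one of four joint states, and estimate the transition matrix by counting. The correlated simulated returns stand in for actual portfolio data.

```python
import numpy as np

rng = np.random.default_rng(3)
T = 5000
common = rng.standard_normal(T)
# Two correlated synthetic return series (assumed data, not actual stocks)
returns = np.column_stack([common + rng.standard_normal(T),
                           common + rng.standard_normal(T)])
spins = np.where(returns > 0, 1, -1)          # code price changes as +/-1 spins

# Collapse the spin pair into one of 4 joint states and count transitions
state = (spins[:, 0] == 1).astype(int) * 2 + (spins[:, 1] == 1).astype(int)
counts = np.zeros((4, 4))
for s, s_next in zip(state[:-1], state[1:]):
    counts[s, s_next] += 1
P = counts / counts.sum(axis=1, keepdims=True)  # estimated transition matrix
```

Synchronization of the two stocks shows up as extra probability mass on the "both up" and "both down" states relative to the independent case.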
Adaptable Multivariate Calibration Models for Spectral Applications
Energy Technology Data Exchange (ETDEWEB)
Thomas, Edward V.
1999-12-20
Multivariate calibration techniques have been used in a wide variety of spectroscopic situations. In many of these situations spectral variation can be partitioned into meaningful classes. For example, suppose that multiple spectra are obtained from each of a number of different objects wherein the level of the analyte of interest varies within each object over time. In such situations the total spectral variation observed across all measurements has two distinct general sources of variation: intra-object and inter-object. One might want to develop a global multivariate calibration model that predicts the analyte of interest accurately both within and across objects, including new objects not involved in developing the calibration model. However, this goal might be hard to realize if the inter-object spectral variation is complex and difficult to model. If the intra-object spectral variation is consistent across objects, an effective alternative approach might be to develop a generic intra-object model that can be adapted to each object separately. This paper contains recommendations for experimental protocols and data analysis in such situations. The approach is illustrated with an example involving the noninvasive measurement of glucose using near-infrared reflectance spectroscopy. Extensions to calibration maintenance and calibration transfer are discussed.
Modelling lifetime data with multivariate Tweedie distribution
Nor, Siti Rohani Mohd; Yusof, Fadhilah; Bahar, Arifah
2017-05-01
This study aims to measure the dependence between individual lifetimes by applying the multivariate Tweedie distribution to lifetime data. Incorporating dependence between lifetimes in the mortality model is a new idea that has a significant impact on the risk of an annuity portfolio, in contrast to standard actuarial methods, which assume independence between lifetimes. Hence, this paper applies the Tweedie family of distributions to a portfolio of lifetimes to induce dependence between lives. The Tweedie distribution is chosen since it contains symmetric and non-symmetric, as well as light-tailed and heavy-tailed, distributions. Parameter estimation is modified in order to fit the Tweedie distribution to the data; this procedure is developed using the method of moments. In addition, a comparison stage checks the adequacy of the fit between observed and expected mortality. Finally, the importance of including systematic mortality risk in the model is justified by Pearson's chi-squared test.
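A method-of-moments sketch for the Tweedie variance function Var(Y) = φμ^p: regress log sample variances on log sample means across groups, so the slope estimates the power p and the intercept estimates log φ. Gamma-distributed groups (a Tweedie case with p = 2) serve as illustrative data; this is not the paper's mortality dataset.

```python
import numpy as np

rng = np.random.default_rng(6)
# Groups of gamma-distributed "lifetimes": variance = mean^2 / shape, i.e. p = 2
shape = 4.0
means = np.linspace(1.0, 10.0, 30)
log_m, log_v = [], []
for mu in means:
    sample = rng.gamma(shape, mu / shape, size=5000)
    log_m.append(np.log(sample.mean()))
    log_v.append(np.log(sample.var()))

# Tweedie variance function Var = phi * mu^p  =>  log Var = log phi + p log mu
p_hat, log_phi_hat = np.polyfit(log_m, log_v, 1)
phi_hat = np.exp(log_phi_hat)
```

For gamma data the true values are p = 2 and φ = 1/shape, so the fitted slope and intercept give a quick check that the moment estimator behaves sensibly.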
Measuring equilibrium models: a multivariate approach
Directory of Open Access Journals (Sweden)
Nadji RAHMANIA
2011-04-01
This paper presents a multivariate methodology for obtaining measures of unobserved macroeconomic variables. The procedure used is the multivariate Hodrick-Prescott filter, which depends on smoothing parameters. The choice of these parameters is crucial. Our approach is based on consistent estimators of these parameters, depending only on the observed data.
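The univariate Hodrick-Prescott filter underlying the multivariate extension above reduces to one linear solve: the trend minimizes fit plus smoothness, giving tau = (I + λDᵀD)⁻¹y with D the second-difference operator. This sketch is the standard filter on simulated data, not the authors' multivariate estimator or their data-driven choice of λ.

```python
import numpy as np

def hp_filter(y, lam=1600.0):
    """Hodrick-Prescott trend: solve (I + lam * D'D) tau = y,
    where D is the second-difference operator."""
    n = len(y)
    D = np.diff(np.eye(n), n=2, axis=0)   # (n-2) x n second-difference matrix
    tau = np.linalg.solve(np.eye(n) + lam * (D.T @ D), y)
    return tau, y - tau                   # trend and cycle components

rng = np.random.default_rng(4)
t = np.arange(200)
# Synthetic series: linear trend + slow cycle + noise (illustrative only)
y = 0.05 * t + np.sin(t / 20.0) + 0.3 * rng.standard_normal(200)
trend, cycle = hp_filter(y)
```

The smoothing parameter λ controls the trade-off: larger values force the trend's second differences toward zero, which is why its data-driven selection matters in the paper.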
Multivariate Varying Coefficient Model for Functional Responses
Zhu, Hongtu; Li, Runze; Kong, Linglong
2012-10-01
Motivated by recent work studying massive imaging data in the neuroimaging literature, we propose multivariate varying coefficient models (MVCM) for modeling the relation between multiple functional responses and a set of covariates. We develop several statistical inference procedures for MVCM and systematically study their theoretical properties. We first establish the weak convergence of the local linear estimate of the coefficient functions, as well as its asymptotic bias and variance, and then we derive the asymptotic bias and mean integrated squared error of smoothed individual functions and their uniform convergence rate. We establish the uniform convergence rate of the estimated covariance function of the individual functions and its associated eigenvalues and eigenfunctions. We propose a global test for linear hypotheses of varying coefficient functions, and derive its asymptotic distribution under the null hypothesis. We also propose a simultaneous confidence band for each individual effect curve. We conduct Monte Carlo simulation to examine the finite-sample performance of the proposed procedures. We apply MVCM to investigate the development of white matter diffusivities along the genu tract of the corpus callosum in a clinical study of neurodevelopment.
Multivariate models of adult Pacific salmon returns.
Burke, Brian J; Peterson, William T; Beckman, Brian R; Morgan, Cheryl; Daly, Elizabeth A; Litz, Marisa
2013-01-01
Most modeling and statistical approaches encourage simplicity, yet ecological processes are often complex, as they are influenced by numerous dynamic environmental and biological factors. Pacific salmon abundance has been highly variable over the last few decades and most forecasting models have proven inadequate, primarily because of a lack of understanding of the processes affecting variability in survival. Better methods and data for predicting the abundance of returning adults are therefore required to effectively manage the species. We combined 31 distinct indicators of the marine environment collected over an 11-year period into a multivariate analysis to summarize and predict adult spring Chinook salmon returns to the Columbia River in 2012. In addition to forecasts, this tool quantifies the strength of the relationship between various ecological indicators and salmon returns, allowing interpretation of ecosystem processes. The relative importance of indicators varied, but a few trends emerged. Adult returns of spring Chinook salmon were best described using indicators of bottom-up ecological processes such as composition and abundance of zooplankton and fish prey as well as measures of individual fish, such as growth and condition. Local indicators of temperature or coastal upwelling did not contribute as much as large-scale indicators of temperature variability, matching the spatial scale over which salmon spend the majority of their ocean residence. Results suggest that effective management of Pacific salmon requires multiple types of data and that no single indicator can represent the complex early-ocean ecology of salmon.
A Simplified Approach to Multivariable Model Predictive Control
Directory of Open Access Journals (Sweden)
Michael Short
2015-01-01
The benefits of applying the range of technologies generally known as Model Predictive Control (MPC) to the control of industrial processes have been well documented in recent years. One of the principal drawbacks of MPC schemes is the relatively high online computational burden when used with adaptive, constrained and/or multivariable processes, which has prompted some researchers and practitioners to seek simplified approaches to its implementation. To date, several schemes have been proposed based around a simplified 1-norm formulation of multivariable MPC, which is solved online using the simplex algorithm in both the unconstrained and constrained cases. In this paper a 2-norm approach to simplified multivariable MPC is formulated, which is solved online using a vector-matrix product or a simple iterative coordinate-descent algorithm for the unconstrained and constrained cases, respectively. A CARIMA model is employed to ensure offset-free control, and a simple scheme to produce the optimal predictions is described. A small simulation study and further discussion illustrate that this quadratic formulation performs well, can be considered a useful adjunct to its linear counterpart, and still retains beneficial features such as ease of computer-based implementation.
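In the unconstrained 2-norm case described above, the online work really is just a vector-matrix product: with a dynamic matrix G built from step-response coefficients, the first control move is the top row of (GᵀG + λI)⁻¹Gᵀ applied to the predicted error. The first-order plant and tuning values below are assumptions for illustration, and no CARIMA disturbance model is included; offset-free tracking here follows from the incremental-input formulation with a perfect model.

```python
import numpy as np

# First-order plant y[k+1] = a*y[k] + b*u[k] and its unit-step response
a, b = 0.8, 0.5
P, M = 10, 3                       # prediction and control horizons (assumed)
step = np.array([b * sum(a**j for j in range(i + 1)) for i in range(P)])

# Dynamic matrix: predicted output deviation = G @ du (future input moves)
G = np.zeros((P, M))
for i in range(P):
    for j in range(min(i + 1, M)):
        G[i, j] = step[i - j]

lam = 0.5                          # move-suppression weight (assumed tuning)
K = np.linalg.solve(G.T @ G + lam * np.eye(M), G.T)[0]   # first-move gain row

# Receding-horizon simulation toward a unit setpoint (model matches plant)
r_sp, y, u = 1.0, 0.0, 0.0
ys = []
for _ in range(100):
    # Free response: future outputs if the input were held constant at u
    free = np.array([a**(i + 1) * y + step[i] * u for i in range(P)])
    du = K @ (r_sp - free)         # only the first computed move is applied
    u += du
    y = a * y + b * u
    ys.append(y)
```

The gain row K is computed once offline, so each control interval costs only the free-response update and one dot product, which is the computational saving the paper targets.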
Multivariate linear models and repeated measurements revisited
DEFF Research Database (Denmark)
Dalgaard, Peter
2009-01-01
Methods for generalized analysis of variance based on multivariate normal theory have been known for many years. In a repeated measurements context, it is most often of interest to consider transformed responses, typically within-subject contrasts or averages. Efficiency considerations lead …
A New Model of Peaks over Threshold for Multivariate Extremes
Institute of Scientific and Technical Information of China (English)
罗耀; 朱良生
2014-01-01
The peaks over threshold (POT) methods are used for the univariate and multivariate extreme value analyses of the wave and wind records collected from a hydrometric station in the South China Sea. A new multivariate POT method, the multivariate GPD (MGPD) model, is proposed, which can be built easily from developed parametric models and is a natural distribution for multivariate POT methods. A joint threshold selection approach is used in the MGPD model as well. Finally, sensitivity analyses are carried out to calculate the return values of the base shear, and two declustering schemes are compared in this study.
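The univariate POT step underlying such analyses can be sketched as follows. The synthetic data, the quantile-based threshold, and the method-of-moments GPD estimators are illustrative assumptions, not the paper's joint threshold selection procedure.

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic record standing in for wave heights (not the South China Sea data)
data = rng.exponential(scale=1.0, size=5000)

u = np.quantile(data, 0.95)   # threshold choice is an assumption
exc = data[data > u] - u      # peaks-over-threshold exceedances

# Method-of-moments estimators for the GPD shape xi and scale sigma
m, v = exc.mean(), exc.var()
xi = 0.5 * (1.0 - m * m / v)
sigma = 0.5 * m * (1.0 + m * m / v)

# T-observation return level (exceeded on average once per T observations)
zeta = exc.size / data.size   # exceedance rate
T = 10_000
if abs(xi) > 1e-6:
    ret = u + sigma / xi * ((T * zeta) ** xi - 1.0)
else:
    ret = u + sigma * np.log(T * zeta)
```

For exponential data the fitted shape parameter should be near zero, and the return level sits well above the threshold.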
Modelling subset multivariate ARCH model via the AIC principle
Institute of Scientific and Technical Information of China (English)
(Anonymous)
2002-01-01
In this paper we consider the problem of identifying a parsimonious subset multivariate ARCH model based on the AIC principle. The proposed approach can reduce the number of parameters in the final ARCH specification and allows for non-constant correlations between the components. Some simulation results illustrate the viability of the proposed procedure.
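A minimal univariate analogue of AIC-based subset selection, using least-squares regression of squared returns on lagged squared returns as a stand-in for full ARCH likelihoods (a simplifying assumption for illustration):

```python
import itertools
import numpy as np

rng = np.random.default_rng(1)
# Simulate an ARCH(1) series: only lag 1 truly enters the volatility
n, omega, alpha1 = 2000, 0.2, 0.5
r = np.zeros(n)
for t in range(1, n):
    r[t] = np.sqrt(omega + alpha1 * r[t - 1] ** 2) * rng.standard_normal()

# Candidate regressors: squared returns at lags 1..3
r2, max_lag = r ** 2, 3
y = r2[max_lag:]
X_full = np.column_stack([r2[max_lag - L:n - L] for L in range(1, max_lag + 1)])

# Score every lag subset by AIC of a least-squares fit of r2 on lagged r2
best = None
for k in range(max_lag + 1):
    for subset in itertools.combinations(range(max_lag), k):
        X = np.column_stack([np.ones(len(y))] + [X_full[:, j] for j in subset])
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        rss = np.sum((y - X @ beta) ** 2)
        aic = len(y) * np.log(rss / len(y)) + 2 * (len(subset) + 1)
        if best is None or aic < best[0]:
            best = (aic, subset)
```

The selected subset should include lag 1, the lag that actually drives the simulated volatility; a parsimonious criterion such as AIC tends to exclude the spurious lags.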
Small Sample Properties of Bayesian Multivariate Autoregressive Time Series Models
Price, Larry R.
2012-01-01
The aim of this study was to compare the small sample (N = 1, 3, 5, 10, 15) performance of a Bayesian multivariate vector autoregressive (BVAR-SEM) time series model relative to frequentist power and parameter estimation bias. A multivariate autoregressive model was developed based on correlated autoregressive time series vectors of varying…
Fractional and multivariable calculus model building and optimization problems
Mathai, A M
2017-01-01
This textbook presents a rigorous approach to multivariable calculus in the context of model building and optimization problems. This comprehensive overview is based on lectures given at five SERC Schools from 2008 to 2012 and covers a broad range of topics that will enable readers to understand and create deterministic and nondeterministic models. Researchers, advanced undergraduate, and graduate students in mathematics, statistics, physics, engineering, and biological sciences will find this book to be a valuable resource for finding appropriate models to describe real-life situations. The first chapter begins with an introduction to fractional calculus moving on to discuss fractional integrals, fractional derivatives, fractional differential equations and their solutions. Multivariable calculus is covered in the second chapter and introduces the fundamentals of multivariable calculus (multivariable functions, limits and continuity, differentiability, directional derivatives and expansions of multivariable ...
Mittal, Surabhi; Mehar, Mamta
2016-01-01
Purpose: The paper analyzes factors that affect the likelihood of adoption of different agriculture-related information sources by farmers. Design/Methodology/Approach: The paper links the theoretical understanding of the existing multiple sources of information that farmers use, with the empirical model to analyze the factors that affect the…
Modelling and Forecasting Multivariate Realized Volatility
DEFF Research Database (Denmark)
Chiriac, Roxana; Voev, Valeri
......We provide an empirical application of the model, in which we show by means of stochastic dominance tests that the returns from an optimal portfolio based on the model's forecasts second-order dominate returns of portfolios optimized on the basis of traditional MGARCH models. This result implies...
The value of multivariate model sophistication
DEFF Research Database (Denmark)
Rombouts, Jeroen; Stentoft, Lars; Violante, Francesco
2014-01-01
in their specification of the conditional variance, conditional correlation, innovation distribution, and estimation approach. All of the models belong to the dynamic conditional correlation class, which is particularly suitable because it allows consistent estimations of the risk neutral dynamics with a manageable...... of correlation models, we propose a new model that allows for correlation spillovers without too many parameters. This model performs about 60% better than the existing correlation models we consider. Relaxing a Gaussian innovation for a Laplace innovation assumption improves the pricing in a more minor way...
Sparse Multivariate Modeling: Priors and Applications
DEFF Research Database (Denmark)
Henao, Ricardo
This thesis presents a collection of statistical models that attempt to take advantage of every piece of prior knowledge available to provide the models with as much structure as possible. The main motivation for introducing these models is interpretability, since in practice we want to be able...... to use them as hypothesis generating tools. All of our models start from a family of structures, for instance factor models, directed acyclic graphs, classifiers, etc. Then we let them be selectively sparse as a way to provide them with structural flexibility and interpretability. Finally, we complement...... modeling, a model for peptide-protein/protein-protein interactions called latent protein tree, a framework for sparse Gaussian process classification based on active set selection and a linear multi-category sparse classifier specially targeted to gene expression data. The thesis is organized to provide...
Multivariate Term Structure Models with Level and Heteroskedasticity Effects
DEFF Research Database (Denmark)
Christiansen, Charlotte
2005-01-01
The paper introduces and estimates a multivariate level-GARCH model for the long rate and the term-structure spread where the conditional volatility is proportional to the γth power of the variable itself (level effects) and the conditional covariance matrix evolves according to a multivariate GARCH process (heteroskedasticity effects). The long-rate variance exhibits heteroskedasticity effects and level effects in accordance with the square-root model. The spread variance exhibits heteroskedasticity effects but no level effects. The level-GARCH model is preferred above the GARCH model and the level model. GARCH effects are more important than level effects. The results are robust to the maturity of the interest rates.
Testing for Causality in Variance Using Multivariate GARCH Models
Christian M. Hafner; Herwartz, Helmut
2008-01-01
Tests of causality in variance in multiple time series have been proposed recently, based on residuals of estimated univariate models. Although such tests are applied frequently, little is known about their power properties. In this paper we show that a convenient alternative to residual based testing is to specify a multivariate volatility model, such as multivariate GARCH (or BEKK), and construct a Wald test on noncausality in variance. We compare both approaches to testing causality in var...
Testing for causality in variance using multivariate GARCH models
Hafner, Christian; Herwartz, H.
2004-01-01
Tests of causality in variance in multiple time series have been proposed recently, based on residuals of estimated univariate models. Although such tests are applied frequently, little is known about their power properties. In this paper we show that a convenient alternative to residual based testing is to specify a multivariate volatility model, such as multivariate GARCH (or BEKK), and construct a Wald test on noncausality in variance. We compare both approaches to testing causa...
Multivariate Survival Mixed Models for Genetic Analysis of Longevity Traits
DEFF Research Database (Denmark)
Pimentel Maia, Rafael; Madsen, Per; Labouriau, Rodrigo
2013-01-01
A class of multivariate mixed survival models for continuous and discrete time with a complex covariance structure is introduced in a context of quantitative genetic applications. The methods introduced can be used in many applications in quantitative genetics, although the discussion presented...... concentrates on longevity studies. The framework presented allows one to combine models based on continuous time with models based on discrete time in a joint analysis. The continuous time models are approximations of the frailty model in which the hazard function will be assumed to be piece-wise constant....... The discrete time models used are multivariate variants of the discrete relative risk models. These models allow for regular parametric likelihood-based inference by exploring a coincidence of their likelihood functions and the likelihood functions of suitably defined multivariate generalized linear mixed...
Multivariate Survival Mixed Models for Genetic Analysis of Longevity Traits
DEFF Research Database (Denmark)
Pimentel Maia, Rafael; Madsen, Per; Labouriau, Rodrigo
2014-01-01
A class of multivariate mixed survival models for continuous and discrete time with a complex covariance structure is introduced in a context of quantitative genetic applications. The methods introduced can be used in many applications in quantitative genetics, although the discussion presented...... concentrates on longevity studies. The framework presented allows one to combine models based on continuous time with models based on discrete time in a joint analysis. The continuous time models are approximations of the frailty model in which the hazard function will be assumed to be piece-wise constant....... The discrete time models used are multivariate variants of the discrete relative risk models. These models allow for regular parametric likelihood-based inference by exploring a coincidence of their likelihood functions and the likelihood functions of suitably defined multivariate generalized linear mixed...
A Multivariate Model of Achievement in Geometry
Bailey, MarLynn; Taasoobshirazi, Gita; Carr, Martha
2014-01-01
Previous studies have shown that several key variables influence student achievement in geometry, but no research has been conducted to determine how these variables interact. A model of achievement in geometry was tested on a sample of 102 high school students. Structural equation modeling was used to test hypothesized relationships among…
A Multivariate Model of Conceptual Change
Taasoobshirazi, Gita; Heddy, Benjamin; Bailey, MarLynn; Farley, John
2016-01-01
The present study used the Cognitive Reconstruction of Knowledge Model (CRKM) model of conceptual change as a framework for developing and testing how key cognitive, motivational, and emotional variables are linked to conceptual change in physics. This study extends an earlier study developed by Taasoobshirazi and Sinatra ("J Res Sci…
Modelling and Forecasting Multivariate Realized Volatility
DEFF Research Database (Denmark)
Halbleib, Roxana; Voev, Valeri
2011-01-01
This paper proposes a methodology for dynamic modelling and forecasting of realized covariance matrices based on fractionally integrated processes. The approach allows for flexible dependence patterns and automatically guarantees positive definiteness of the forecast. We provide an empirical appl...
A multivariate heuristic model for fuzzy time-series forecasting.
Huarng, Kun-Huang; Yu, Tiffany Hui-Kuang; Hsu, Yu Wei
2007-08-01
Fuzzy time-series models have been widely applied due to their ability to handle nonlinear data directly and because no rigid assumptions for the data are needed. In addition, many such models have been shown to provide better forecasting results than their conventional counterparts. However, since most of these models require complicated matrix computations, this paper proposes the adoption of a multivariate heuristic function that can be integrated with univariate fuzzy time-series models into multivariate models. Such a multivariate heuristic function can easily be extended and integrated with various univariate models. Furthermore, the integrated model can handle multiple variables to improve forecasting results and, at the same time, avoid complicated computations due to the inclusion of multiple variables.
Structural Equation Modeling of Multivariate Time Series
du Toit, Stephen H. C.; Browne, Michael W.
2007-01-01
The covariance structure of a vector autoregressive process with moving average residuals (VARMA) is derived. It differs from other available expressions for the covariance function of a stationary VARMA process and is compatible with current structural equation methodology. Structural equation modeling programs, such as LISREL, may therefore be…
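For the VAR(1) special case of the processes discussed above, the covariance function has a closed form that can be checked numerically; the coefficient matrices below are arbitrary illustrations, not values from the paper.

```python
import numpy as np

# Covariance function of a stationary VAR(1): x_t = A x_{t-1} + e_t, e_t ~ (0, S).
# Gamma(0) solves the discrete Lyapunov equation Gamma0 = A Gamma0 A' + S,
# i.e. vec(Gamma0) = (I - kron(A, A))^{-1} vec(S), and Gamma(h) = A^h Gamma0.
A = np.array([[0.5, 0.1],
              [0.0, 0.3]])
S = np.array([[1.0, 0.2],
              [0.2, 1.0]])
k = A.shape[0]

vecG0 = np.linalg.solve(np.eye(k * k) - np.kron(A, A), S.reshape(-1, order="F"))
G0 = vecG0.reshape(k, k, order="F")

# Numerical check against a long simulated path
rng = np.random.default_rng(2)
E = rng.standard_normal((100000, k)) @ np.linalg.cholesky(S).T
X = np.zeros_like(E)
x = np.zeros(k)
for t in range(len(E)):
    x = A @ x + E[t]
    X[t] = x
emp = X.T @ X / len(X)   # empirical lag-0 covariance (mean-zero process)
```

The empirical covariance of the simulated path should agree with the analytical Gamma(0) up to Monte Carlo error.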
Multivariable Wind Modeling in State Space
DEFF Research Database (Denmark)
Sichani, Mahdi Teimouri; Pedersen, B. J.
2011-01-01
Turbulence of the incoming wind field is of paramount importance to the dynamic response of wind turbines. Hence reliable stochastic models of the turbulence should be available from which time series can be generated for dynamic response and structural safety analysis. In the paper an empirical...... cross-spectral density function for the along-wind turbulence component over the rotor plane is taken as the starting point. The spectrum is spatially discretized in terms of a Hermitian cross-spectral density matrix for the turbulence state vector which turns out not to be positive definite. Since...... the succeeding state space and ARMA modeling of the turbulence rely on the positive definiteness of the cross-spectral density matrix, the problem with the non-positive definiteness of such matrices is at first addressed and suitable treatments regarding it are proposed. From the adjusted positive definite cross...
Calibrated predictions for multivariate competing risks models.
Gorfine, Malka; Hsu, Li; Zucker, David M; Parmigiani, Giovanni
2014-04-01
Prediction models for time-to-event data play a prominent role in assessing the individual risk of a disease, such as cancer. Accurate disease prediction models provide an efficient tool for identifying individuals at high risk, and provide the groundwork for estimating the population burden and cost of disease and for developing patient care guidelines. We focus on risk prediction of a disease in which family history is an important risk factor that reflects inherited genetic susceptibility, shared environment, and common behavior patterns. In this work family history is accommodated using frailty models, with the main novel feature being allowing for competing risks, such as other diseases or mortality. We show through a simulation study that naively treating competing risks as independent right censoring events results in non-calibrated predictions, with the expected number of events overestimated. Discrimination performance is not affected by ignoring competing risks. Our proposed prediction methodologies correctly account for competing events, are very well calibrated, and easy to implement.
Multivariate statistical modelling based on generalized linear models
Fahrmeir, Ludwig
1994-01-01
This book is concerned with the use of generalized linear models for univariate and multivariate regression analysis. Its emphasis is to provide a detailed introductory survey of the subject based on the analysis of real data drawn from a variety of subjects including the biological sciences, economics, and the social sciences. Where possible, technical details and proofs are deferred to an appendix in order to provide an accessible account for non-experts. Topics covered include: models for multi-categorical responses, model checking, time series and longitudinal data, random effects models, and state-space models. Throughout, the authors have taken great pains to discuss the underlying theoretical ideas in ways that relate well to the data at hand. As a result, numerous researchers whose work relies on the use of these models will find this an invaluable account to have on their desks. "The basic aim of the authors is to bring together and review a large part of recent advances in statistical modelling of m...
Evaluating multivariate GARCH models in the Nordic electricity markets
Energy Technology Data Exchange (ETDEWEB)
Malo, P.; Kanto, A.
2005-07-01
This paper considers a variety of specification tests for multivariate GARCH models that are used in dynamic hedging in the electricity markets. The test statistics include the robust conditional moments tests for sign-size bias along with the recently introduced copula tests for an appropriate dependence structure. We consider this effort worthwhile, since tests of multivariate GARCH models are often omitted and models end up selected ad hoc depending on the results they generate. Hedging performance comparisons, in terms of unconditional and conditional ex-post portfolio variance reduction, are conducted. (orig.)
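The ex-post variance-reduction criterion used in such hedging comparisons can be illustrated with a static minimum-variance hedge on synthetic returns; this is a simplified stand-in, not the paper's conditional MGARCH hedge.

```python
import numpy as np

rng = np.random.default_rng(6)
# Correlated "spot" and "futures" returns (illustrative, not Nordic market data)
cov = np.array([[1.0, 0.8],
                [0.8, 1.0]])
spot, fut = np.linalg.cholesky(cov) @ rng.standard_normal((2, 5000))

# Static minimum-variance hedge ratio and ex-post variance reduction
h = np.cov(spot, fut)[0, 1] / np.var(fut)
hedged = spot - h * fut
reduction = 1.0 - np.var(hedged) / np.var(spot)
```

With correlation 0.8 the achievable unconditional variance reduction is about the squared correlation; a dynamic hedge replaces the single ratio h with a time-varying one from the conditional covariance matrix.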
Genetic association in multivariate phenotypic data: power in five models
Minica, C.C.; Boomsma, D.I.; van der Sluis, S.; Dolan, C.V.
2010-01-01
This article concerns the power of various data analytic strategies to detect the effect of a single genetic variant (GV) in multivariate data. We simulated exactly fitting monozygotic and dizygotic phenotypic data according to single and two common factor models, and simplex models. We calculated t
Robust Ranking of Multivariate GARCH Models by Problem Dimension
M. Caporin (Massimiliano); M.J. McAleer (Michael)
2012-01-01
During the last 15 years, several Multivariate GARCH (MGARCH) models have appeared in the literature. Recent research has begun to examine MGARCH specifications in terms of their out-of-sample forecasting performance. We provide an empirical comparison of alternative MGARCH models, namely BEKK, DCC, Corrected DCC (cDCC), CCC, OGARCH Exponentially Weighted Moving Average, and covariance shrinking, using historical data for 89 US equities. We contribute to the literature in several...
Ranking Multivariate GARCH Models by Problem Dimension: An Empirical Evaluation
M. Caporin (Massimiliano); M.J. McAleer (Michael)
2011-01-01
In the last 15 years, several Multivariate GARCH (MGARCH) models have appeared in the literature. Recent research has begun to examine MGARCH specifications in terms of their out-of-sample forecasting performance. In this paper, we provide an empirical comparison of a set of models, name...
Multivariate Variance Targeting in the BEKK-GARCH Model
DEFF Research Database (Denmark)
Pedersen, Rasmus Søndergaard; Rahbek, Anders
This paper considers asymptotic inference in the multivariate BEKK model based on (co-)variance targeting (VT). By definition the VT estimator is a two-step estimator and the theory presented is based on expansions of the modified likelihood function, or estimating function, corresponding to these two steps. Strong consistency is established under weak moment conditions, while sixth order moment restrictions are imposed to establish asymptotic normality. Included simulations indicate that the multivariately induced higher-order moment constraints are indeed necessary.
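A rough sketch of the two-step variance-targeting idea, using a scalar BEKK recursion and a coarse grid search in place of full QMLE optimization; both are simplifying assumptions, not the paper's estimator.

```python
import numpy as np

rng = np.random.default_rng(3)
# Simulate a bivariate scalar-BEKK process (parameters are illustrative)
n, a_true, b_true = 3000, 0.10, 0.85
Sbar_true = np.array([[1.0, 0.3], [0.3, 1.0]])
H = Sbar_true.copy()
R = np.zeros((n, 2))
for t in range(n):
    R[t] = np.linalg.cholesky(H) @ rng.standard_normal(2)
    H = (1 - a_true - b_true) * Sbar_true + a_true * np.outer(R[t], R[t]) + b_true * H

# Step 1 (targeting): estimate the unconditional covariance by its sample analogue
Sbar = R.T @ R / n

# Step 2: Gaussian quasi-likelihood for (a, b), here on a coarse grid
def negloglik(a, b):
    H, nll = Sbar.copy(), 0.0
    for t in range(n):
        _, logdet = np.linalg.slogdet(H)
        nll += 0.5 * (logdet + R[t] @ np.linalg.solve(H, R[t]))
        H = (1 - a - b) * Sbar + a * np.outer(R[t], R[t]) + b * H
    return nll

grid = [(a, b) for a in (0.05, 0.10, 0.15)
               for b in (0.70, 0.75, 0.80, 0.85, 0.90)]
a_hat, b_hat = min(grid, key=lambda p: negloglik(*p))
```

The first step fixes the intercept via the sample covariance, so the second step only searches over the dynamic parameters, which is exactly what makes VT estimation a two-step procedure.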
Piecewise multivariate modelling of sequential metabolic profiling data
Directory of Open Access Journals (Sweden)
Nicholson Jeremy K
2008-02-01
Background: Modelling the time-related behaviour of biological systems is essential for understanding their dynamic responses to perturbations. In metabolic profiling studies, the sampling rate and number of sampling points are often restricted due to experimental and biological constraints. Results: A supervised multivariate modelling approach with the objective to model the time-related variation in the data for short and sparsely sampled time-series is described. A set of piecewise Orthogonal Projections to Latent Structures (OPLS) models are estimated, describing changes between successive time points. The individual OPLS models are linear, but the piecewise combination of several models accommodates modelling and prediction of changes which are non-linear with respect to the time course. We demonstrate the method on both simulated and metabolic profiling data, illustrating how time related changes are successfully modelled and predicted. Conclusion: The proposed method is effective for modelling and prediction of short and multivariate time series data. A key advantage of the method is model transparency, allowing easy interpretation of time-related variation in the data. The method provides a competitive complement to commonly applied multivariate methods such as OPLS and Principal Component Analysis (PCA) for modelling and analysis of short time-series data.
Multivariate Receptor Models for Spatially Correlated Multipollutant Data
Jun, Mikyoung
2013-08-01
The goal of multivariate receptor modeling is to estimate the profiles of major pollution sources and quantify their impacts based on ambient measurements of pollutants. Traditionally, multivariate receptor modeling has been applied to multiple air pollutant data measured at a single monitoring site or measurements of a single pollutant collected at multiple monitoring sites. Despite the growing availability of multipollutant data collected from multiple monitoring sites, there has not yet been any attempt to incorporate spatial dependence that may exist in such data into multivariate receptor modeling. We propose a spatial statistics extension of multivariate receptor models that enables us to incorporate spatial dependence into estimation of source composition profiles and contributions given the prespecified number of sources and the model identification conditions. The proposed method yields more precise estimates of source profiles by accounting for spatial dependence in the estimation. More importantly, it enables predictions of source contributions at unmonitored sites as well as when there are missing values at monitoring sites. The method is illustrated with simulated data and real multipollutant data collected from eight monitoring sites in Harris County, Texas. Supplementary materials for this article, including data and R code for implementing the methods, are available online on the journal web site. © 2013 Copyright Taylor and Francis Group, LLC.
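Source apportionment of the kind described above is often approximated by nonnegative matrix factorization. The following sketch, with hypothetical source profiles and no spatial dependence, illustrates the basic decomposition of pollutant measurements into profiles and contributions.

```python
import numpy as np

rng = np.random.default_rng(4)
# Two hypothetical source profiles (rows) over five pollutant species
F_true = np.array([[0.6, 0.3, 0.1, 0.0, 0.0],
                   [0.0, 0.1, 0.2, 0.3, 0.4]])
G_true = rng.uniform(0.5, 2.0, size=(200, 2))     # source contributions
X = G_true @ F_true + 0.01 * rng.uniform(size=(200, 5))

# Multiplicative-update NMF: X ~ G F with G, F >= 0
k = 2
G = rng.uniform(size=(200, k))
F = rng.uniform(size=(k, 5))
for _ in range(500):
    F *= (G.T @ X) / (G.T @ G @ F + 1e-9)
    G *= (X @ F.T) / (G @ F @ F.T + 1e-9)

err = np.linalg.norm(X - G @ F) / np.linalg.norm(X)
```

The factorization recovers the low-rank structure up to the usual scaling and permutation ambiguity; the receptor-modeling literature adds identification conditions, and the paper adds spatial dependence, on top of this basic decomposition.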
Critical elements on fitting the Bayesian multivariate Poisson Lognormal model
Zamzuri, Zamira Hasanah binti
2015-10-01
Motivated by a problem on fitting multivariate models to traffic accident data, a detailed discussion of the Multivariate Poisson Lognormal (MPL) model is presented. This paper reveals three critical elements on fitting the MPL model: the setting of initial estimates, hyperparameters and tuning parameters. These issues have not been highlighted in the literature. Based on simulation studies conducted, we have shown that to use the Univariate Poisson Model (UPM) estimates as starting values, at least 20,000 iterations are needed to obtain reliable final estimates. We also illustrated the sensitivity of the specific hyperparameter, which if it is not given extra attention, may affect the final estimates. The last issue is regarding the tuning parameters where they depend on the acceptance rate. Finally, a heuristic algorithm to fit the MPL model is presented. This acts as a guide to ensure that the model works satisfactorily given any data set.
Analyzing the Dynamics of Nonlinear Multivariate Time Series Models
Institute of Scientific and Technical Information of China (English)
Denghua Zhong; Zhengfeng Zhang; Donghai Liu; Stefan Mittnik
2004-01-01
This paper analyzes the dynamics of nonlinear multivariate time series models as represented by generalized impulse response functions and asymmetric functions. We illustrate the measures of shock persistence and asymmetric effects of shocks derived from the generalized impulse response functions and asymmetric functions in bivariate smooth transition regression models. The empirical work investigates a bivariate smooth transition model of US GDP and the unemployment rate.
Modelling, simulation and inference for multivariate time series of counts
Veraart, Almut E. D.
2016-01-01
This article presents a new continuous-time modelling framework for multivariate time series of counts which have an infinitely divisible marginal distribution. The model is based on a mixed moving average process driven by Lévy noise, called a trawl process, where the serial correlation and the cross-sectional dependence are modelled independently of each other. Such processes can exhibit short or long memory. We derive a stochastic simulation algorithm and a statistical inference meth...
Fast Filtering and Smoothing for Multivariate State Space Models
Koopman, S.J.M.; Durbin, J.
1998-01-01
This paper gives a new approach to diffuse filtering and smoothing for multivariate state space models. The standard approach treats the observations as vectors while our approach treats each element of the observational vector individually. This strategy leads to computationally efficient methods f
Multivariate variance targeting in the BEKK-GARCH model
DEFF Research Database (Denmark)
Pedersen, Rasmus S.; Rahbæk, Anders
2014-01-01
This paper considers asymptotic inference in the multivariate BEKK model based on (co-)variance targeting (VT). By definition the VT estimator is a two-step estimator and the theory presented is based on expansions of the modified likelihood function, or estimating function, corresponding to these two steps. Strong consistency is established under weak moment conditions, while sixth-order moment restrictions are imposed to establish asymptotic normality. Included simulations indicate that the multivariately induced higher-order moment constraints are necessary...
Multivariate DCC-GARCH Model: -With Various Error Distributions
Orskaug, Elisabeth
2009-01-01
In this thesis we have studied the DCC-GARCH model with Gaussian, Student's t and skew Student's t-distributed errors. For a basic understanding of the GARCH model, the univariate GARCH and multivariate GARCH models in general were discussed before the DCC-GARCH model was considered. The maximum likelihood method is used to estimate the parameters. The estimation of the correctly specified likelihood is difficult, and hence the DCC-model was designed to allow for two stage estim...
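The second-stage DCC correlation recursion, given standardized residuals from hypothetical first-stage univariate GARCH fits, can be sketched as follows; the residuals and the DCC parameters here are assumed rather than estimated.

```python
import numpy as np

rng = np.random.default_rng(7)
# Standardized residuals, standing in for first-stage univariate GARCH output
n, k = 1000, 2
eps = rng.standard_normal((n, k))

a, b = 0.05, 0.90             # DCC parameters, assumed known for the sketch
Qbar = eps.T @ eps / n        # correlation targeting matrix
Q = Qbar.copy()
for t in range(n):
    Q = (1 - a - b) * Qbar + a * np.outer(eps[t], eps[t]) + b * Q
    d = 1.0 / np.sqrt(np.diag(Q))
    R_t = Q * np.outer(d, d)  # rescale Q_t into a correlation matrix R_t
```

The rescaling step is what guarantees each R_t has unit diagonal; in the full two-stage estimation, (a, b) would be chosen by maximizing the second-stage quasi-likelihood over these recursions.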
Multivariate Product-Shot-noise Cox Point Process Models
DEFF Research Database (Denmark)
Jalilian, Abdollah; Guan, Yongtao; Mateu, Jorge
We introduce a new multivariate product-shot-noise Cox process which is useful for modeling multi-species spatial point patterns with clustering intra-specific interactions and neutral, negative or positive inter-specific interactions. The auto and cross pair correlation functions of the process can be obtained in closed analytical forms and approximate simulation of the process is straightforward. We use the proposed process to model interactions within and among five tree species in the Barro Colorado Island plot.
The multivariate supOU stochastic volatility model
DEFF Research Database (Denmark)
Barndorff-Nielsen, Ole; Stelzer, Robert
Using positive semidefinite supOU (superposition of Ornstein-Uhlenbeck type) processes to describe the volatility, we introduce a multivariate stochastic volatility model for financial data which is capable of modelling long range dependence effects. The finiteness of moments and the second order structure of the volatility, the log returns, as well as their "squares" are discussed in detail. Moreover, we give several examples in which long memory effects occur and study how the model as well as the simple Ornstein-Uhlenbeck type stochastic volatility model behave under linear transformations...
Capturing Multivariate Spatial Dependence: Model, Estimate and then Predict
Cressie, Noel; Burden, Sandy; Davis, Walter; Krivitsky, Pavel N.; Mokhtarian, Payam; Suesse, Thomas; Zammit-Mangion, Andrew
2015-01-01
Physical processes rarely occur in isolation; rather, they influence and interact with one another. Thus, there is great benefit in modeling potential dependence between both spatial locations and different processes. It is the interaction between these two dependencies that is the focus of Genton and Kleiber's paper under discussion. We see the problem of ensuring that any multivariate spatial covariance matrix is nonnegative definite as important, but we also see it as a means to an end. Tha...
Analysis of Forest Foliage Using a Multivariate Mixture Model
Hlavka, C. A.; Peterson, David L.; Johnson, L. F.; Ganapol, B.
1997-01-01
Data with wet chemical measurements and near infrared spectra of ground leaf samples were analyzed to test a multivariate regression technique for estimating component spectra which is based on a linear mixture model for absorbance. The resulting unmixed spectra for carbohydrates, lignin, and protein resemble the spectra of extracted plant starches, cellulose, lignin, and protein. The unmixed protein spectrum has prominent absorption spectra at wavelengths which have been associated with nitrogen bonds.
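The linear mixture model for absorbance can be illustrated by regressing mixture spectra on known compositions to recover the component spectra; all spectra and compositions below are synthetic, not the forest-foliage data.

```python
import numpy as np

rng = np.random.default_rng(5)
# Synthetic component spectra (Gaussian absorption bands over 50 wavelengths)
waves = np.linspace(0.0, 1.0, 50)
comp = np.vstack([np.exp(-(waves - c) ** 2 / 0.02) for c in (0.2, 0.5, 0.8)])

# Linear mixture model for absorbance: A = props @ comp + noise
props = rng.dirichlet(np.ones(3), size=100)   # known sample compositions
A = props @ comp + 0.01 * rng.standard_normal((100, 50))

# Unmix: least-squares regression of absorbance on composition
comp_hat, *_ = np.linalg.lstsq(props, A, rcond=None)
```

With compositions known from wet chemistry, each row of the least-squares solution is an estimated pure-component spectrum, which is the sense in which the regression "unmixes" the measured spectra.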
Bayes linear covariance matrix adjustment for multivariate dynamic linear models
Wilkinson, Darren J
2008-01-01
A methodology is developed for the adjustment of the covariance matrices underlying a multivariate constant time series dynamic linear model. The covariance matrices are embedded in a distribution-free inner-product space of matrix objects which facilitates such adjustment. This approach helps to make the analysis simple, tractable and robust. To illustrate the methods, a simple model is developed for a time series representing sales of certain brands of a product from a cash-and-carry depot. The covariance structure underlying the model is revised, and the benefits of this revision on first order inferences are then examined.
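The core of a Bayes linear adjustment is the formula E_D(X) = E(X) + Cov(X, D) Var(D)^{-1} (d - E(D)). A minimal numeric sketch (with invented numbers, and plain vectors rather than the paper's inner-product space of matrix objects):

```python
import numpy as np

def bayes_linear_adjust(EX, ED, cov_XD, var_D, d):
    """Adjusted expectation E_D(X) = E(X) + Cov(X, D) Var(D)^{-1} (d - E(D))."""
    K = cov_XD @ np.linalg.inv(var_D)
    return EX + K @ (d - ED)

EX = np.array([1.0, 2.0])          # prior expectation of X (invented)
ED = np.array([0.0])               # prior expectation of the data D
cov_XD = np.array([[0.5], [0.2]])  # prior covariance between X and D
var_D = np.array([[1.0]])          # prior variance of D
adj = bayes_linear_adjust(EX, ED, cov_XD, var_D, d=np.array([2.0]))
```

Observing d = 2, two prior standard deviations above E(D), shifts each component of X in proportion to its covariance with D.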
Exchange Rate Forecasting Using Entropy Optimized Multivariate Wavelet Denoising Model
Directory of Open Access Journals (Sweden)
Kaijian He
2014-01-01
Exchange rate is one of the key variables in international economics and international trade. Its movement constitutes one of the most important dynamic systems, characterized by nonlinear behaviors. It becomes more volatile and sensitive to increasingly diversified influencing factors with a higher level of deregulation and global integration worldwide. Facing the increasingly diversified and more integrated market environment, the forecasting model in the exchange markets needs to address the individual and interdependent heterogeneity. In this paper, we propose a heterogeneous market hypothesis (HMH) based exchange rate modeling methodology to model the micromarket structure. Then we further propose the entropy optimized wavelet-based forecasting algorithm under the proposed methodology to forecast the exchange rate movement. The multivariate wavelet denoising algorithm is used to separate and extract the underlying data components with distinct features, which are modeled with multivariate time series models of different specifications and parameters. Maximum entropy is introduced to select the best basis and model parameters to construct the most effective forecasting algorithm. Empirical studies in both Chinese and European markets confirm the significant performance improvement when the proposed model is tested against the benchmark models.
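A one-level Haar transform with soft thresholding illustrates the denoising step in its simplest form (a sketch only; the paper's multivariate wavelet denoising with entropy-optimised basis selection is considerably more involved):

```python
import numpy as np

def haar_denoise(x, thresh):
    """One-level Haar wavelet denoising of an even-length series:
    transform, soft-threshold the detail coefficients, invert."""
    a = (x[0::2] + x[1::2]) / np.sqrt(2)   # approximation coefficients
    d = (x[0::2] - x[1::2]) / np.sqrt(2)   # detail coefficients
    d = np.sign(d) * np.maximum(np.abs(d) - thresh, 0.0)  # soft threshold
    y = np.empty_like(x)                   # inverse Haar transform
    y[0::2] = (a + d) / np.sqrt(2)
    y[1::2] = (a - d) / np.sqrt(2)
    return y

x = np.array([1.0, 2.0, 4.0, 1.0])
same = haar_denoise(x, 0.0)     # zero threshold: perfect reconstruction
smooth = haar_denoise(x, 10.0)  # large threshold: pairwise averages survive
```

With threshold zero the transform is exactly invertible; with a very large threshold only the smooth (approximation) component survives.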
The following SAS macros can be used to create a multivariate usual intake distribution for multiple dietary components that are consumed nearly every day or episodically. A SAS macro for performing balanced repeated replication (BRR) variance estimation is also included.
Regularized multivariate regression models with skew-t error distributions
Chen, Lianfu
2014-06-01
We consider regularization of the parameters in multivariate linear regression models with the errors having a multivariate skew-t distribution. An iterative penalized likelihood procedure is proposed for constructing sparse estimators of both the regression coefficient and inverse scale matrices simultaneously. The sparsity is introduced through penalizing the negative log-likelihood by adding L1-penalties on the entries of the two matrices. Taking advantage of the hierarchical representation of skew-t distributions, and using the expectation conditional maximization (ECM) algorithm, we reduce the problem to penalized normal likelihood and develop a procedure to minimize the ensuing objective function. Using a simulation study the performance of the method is assessed, and the methodology is illustrated using a real data set with a 24-dimensional response vector. © 2014 Elsevier B.V.
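The L1 penalties enter the iterative procedure through the soft-thresholding operator, the proximal map of the L1 norm; a sketch of that single ingredient (the full penalized ECM procedure for skew-t errors is much richer):

```python
import numpy as np

def soft_threshold(M, lam):
    """Elementwise soft thresholding: the proximal operator of the L1
    penalty used to drive small matrix entries exactly to zero."""
    return np.sign(M) * np.maximum(np.abs(M) - lam, 0.0)

st = soft_threshold(np.array([1.5, -0.3, 0.0]), lam=0.5)
```

Entries smaller in magnitude than the penalty are set to zero, producing the sparsity in the coefficient and inverse scale matrices.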
Multivariate data assimilation in an integrated hydrological modelling system
Madsen, Henrik; Zhang, Donghua; Ridler, Marc; Refsgaard, Jens Christian; Høgh Jensen, Karsten
2016-04-01
The rapidly increasing availability of in-situ and remotely sensed hydrological data has offered new opportunities for monitoring and forecasting water resources by combining observation data with hydrological modelling. Efficient multivariate data assimilation in integrated groundwater - surface water hydrological modelling systems is required to fully utilize and optimally combine the different types of observation data. A particular challenge is the assimilation of observation data of different hydrological variables from different monitoring instruments, representing a wide range of spatial and temporal scales and different levels of uncertainty. A multivariate data assimilation framework has been implemented in the MIKE SHE integrated hydrological modelling system by linking the MIKE SHE code with a generic data assimilation library. The data assimilation library supports different state-of-the-art ensemble-based Kalman filter methods, and includes procedures for localisation, joint state, parameter and model error estimation, and bias-aware filtering. Furthermore, it supports use of different stochastic error models to describe model and measurement errors. Results are presented that demonstrate the use of the data assimilation framework for assimilation of different data types in a catchment-scale MIKE SHE model.
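A stochastic ensemble Kalman filter analysis step, the kind of update such libraries implement, can be sketched in a few lines (toy dimensions invented for illustration; real MIKE SHE states are far larger, and the library adds localisation and bias-aware variants):

```python
import numpy as np

def enkf_update(ens, H, y, R, rng):
    """Stochastic EnKF analysis: move each ensemble member toward perturbed
    observations using the ensemble-estimated Kalman gain.
    ens: (n_state, n_members), H: (n_obs, n_state), y: (n_obs,), R: (n_obs, n_obs)."""
    n = ens.shape[1]
    X = ens - ens.mean(axis=1, keepdims=True)        # state anomalies
    HX = H @ ens
    HXp = HX - HX.mean(axis=1, keepdims=True)        # observed anomalies
    Pxy = X @ HXp.T / (n - 1)
    Pyy = HXp @ HXp.T / (n - 1) + R
    K = Pxy @ np.linalg.inv(Pyy)                     # Kalman gain
    Ypert = y[:, None] + rng.multivariate_normal(np.zeros_like(y), R, size=n).T
    return ens + K @ (Ypert - HX)

rng = np.random.default_rng(0)
prior = rng.standard_normal((1, 200))                # 200 members around 0
post = enkf_update(prior, np.eye(1), np.array([10.0]),
                   np.array([[1e-6]]), rng)          # a very precise observation
```

With a near-exact observation the posterior ensemble collapses onto the observed value, as expected from the gain formula.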
Enhancing scientific reasoning by refining students' models of multivariable causality
Keselman, Alla
Inquiry learning as an educational method is gaining increasing support among elementary and middle school educators. In inquiry activities at the middle school level, students are typically asked to conduct investigations and infer causal relationships about multivariable causal systems. In these activities, students usually demonstrate significant strategic weaknesses and insufficient metastrategic understanding of task demands. Present work suggests that these weaknesses arise from students' deficient mental models of multivariable causality, in which effects of individual features are neither additive, nor constant. This study is an attempt to develop an intervention aimed at enhancing scientific reasoning by refining students' models of multivariable causality. Three groups of students engaged in a scientific investigation activity over seven weekly sessions. By creating unique combinations of five features potentially involved in earthquake mechanism and observing associated risk meter readings, students had to find out which of the features were causal, and to learn to predict earthquake risk. Additionally, students in the instructional and practice groups engaged in self-directed practice in making scientific predictions. The instructional group also participated in weekly instructional sessions on making predictions based on multivariable causality. Students in the practice and instructional conditions showed small to moderate improvement in their attention to the evidence and in their metastrategic ability to recognize effective investigative strategies in the work of other students. They also demonstrated a trend towards making a greater number of valid inferences than the control group students. Additionally, students in the instructional condition showed significant improvement in their ability to draw inferences based on multiple records. They also developed more accurate knowledge about non-causal features of the system. These gains were maintained
Multivariate Statistical Modelling of Drought and Heat Wave Events
Manning, Colin; Widmann, Martin; Vrac, Mathieu; Maraun, Douglas; Bevaqua, Emanuele
2016-04-01
Compound extreme events are a combination of two or more contributing events which in themselves may not be extreme but through their joint occurrence produce an extreme impact. Compound events are noted in the latest IPCC report as an important type of extreme event that has been given little attention so far. As part of the CE:LLO project (Compound Events: muLtivariate statisticaL mOdelling) we are developing a multivariate statistical model to gain an understanding of the dependence structure of certain compound events. One focus of this project is on the interaction between drought and heat wave events. Soil moisture has both a local and non-local effect on the occurrence of heat waves, where it strongly controls the latent heat flux affecting the transfer of sensible heat to the atmosphere. These processes can create a feedback whereby a heat wave may be amplified or suppressed by the soil moisture preconditioning; vice versa, the heat wave may in turn have an effect on soil conditions. An aim of this project is to capture this dependence in order to correctly describe the joint probabilities of these conditions and the resulting probability of their compound impact. We will show an application of Pair Copula Constructions (PCCs) to study the aforementioned compound event. PCCs allow, in theory, for the formulation of multivariate dependence structures in any dimension, where the PCC is a decomposition of a multivariate distribution into a product of bivariate components modelled using copulas.
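The bivariate building block of a pair copula construction can be illustrated with a Gaussian copula (an assumed family chosen here purely for illustration; the project would select copula families suited to drought-heat dependence):

```python
import numpy as np
from scipy import stats

def gaussian_copula_sample(rho, n, rng):
    """Uniform pairs whose dependence follows a Gaussian copula with
    correlation rho: correlated normals pushed through their own CDF."""
    cov = np.array([[1.0, rho], [rho, 1.0]])
    z = rng.multivariate_normal([0.0, 0.0], cov, size=n)
    return stats.norm.cdf(z)

u = gaussian_copula_sample(rho=0.8, n=20_000, rng=np.random.default_rng(1))
```

The margins of u are uniform, so any marginal distributions (e.g. for soil moisture deficit and temperature) can be attached via their quantile functions while the copula carries the dependence.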
Multivariate moment closure techniques for stochastic kinetic models
Lakatos, Eszter; Ale, Angelique; Kirk, Paul D. W.; Stumpf, Michael P. H.
2015-09-01
Stochastic effects dominate many chemical and biochemical processes. Their analysis, however, can be computationally prohibitively expensive, and a range of approximation schemes have been proposed to lighten the computational burden. These, notably the increasingly popular linear noise approximation and the more general moment expansion methods, perform well for many dynamical regimes, especially linear systems. At higher levels of nonlinearity, an interplay arises between the nonlinearities and the stochastic dynamics that is much harder for such approximations to capture correctly. Moment-closure approaches promise to address this problem by capturing higher-order terms of the temporally evolving probability distribution. Here, we develop a set of multivariate moment-closures that allows us to describe the stochastic dynamics of nonlinear systems. Multivariate closure captures the way that correlations between different molecular species, induced by the reaction dynamics, interact with stochastic effects. We use multivariate Gaussian, gamma, and lognormal closures and illustrate their use in the context of two models that have proved challenging for previous attempts at approximating stochastic dynamics: oscillations in p53 and Hes1. In addition, we consider a larger system, Erk-mediated mitogen-activated protein kinase signalling, where conventional stochastic simulation approaches incur unacceptably high computational costs.
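For context, the "conventional stochastic simulation" baseline that closures are compared against is Gillespie's algorithm; for a pure-decay toy model it reduces to a few lines (illustrative only; the p53, Hes1 and Erk systems of the paper are nonlinear and far larger, which is exactly why exact simulation becomes expensive):

```python
import numpy as np

def gillespie_decay(x0, c, t_end, rng):
    """Exact stochastic simulation (SSA) of the reaction X -> 0 with
    propensity c * x; returns the copy number at time t_end."""
    t, x = 0.0, x0
    while x > 0:
        dt = rng.exponential(1.0 / (c * x))  # waiting time to next event
        if t + dt > t_end:
            break                            # next event falls after t_end
        t += dt
        x -= 1
    return x

rng = np.random.default_rng(2)
finals = [gillespie_decay(100, 1.0, 1.0, rng) for _ in range(500)]
mean_final = sum(finals) / len(finals)
# for this linear system the exact mean is 100 * exp(-1), about 36.8
```

For linear systems the moment equations close exactly and agree with the SSA mean, which is what makes them a useful sanity check before moving to nonlinear closures.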
Music Genre Classification using the multivariate AR feature integration model
DEFF Research Database (Denmark)
Ahrendt, Peter; Meng, Anders
2005-01-01
Music genre classification systems are normally built as a feature extraction module followed by a classifier. The features are often short-time features with time frames of 10-30 ms, although several characteristics of music require larger time scales. Thus, larger time frames are needed to take informative decisions about musical genre. For the MIREX music genre contest several authors derive long-time features based either on statistical moments and/or temporal structure in the short-time features. In our contribution we model a segment (1.2 s) of short-time features (texture) using a multivariate autoregressive model. Other authors have applied simpler statistical models such as the mean-variance model, which has also been included in several of this year's MIREX submissions, see e.g. Tzanetakis (2005); Burred (2005); Bergstra et al. (2005); Lidy and Rauber (2005).
Optimal model-free prediction from multivariate time series.
Runge, Jakob; Donner, Reik V; Kurths, Jürgen
2015-05-01
Forecasting a time series from multivariate predictors constitutes a challenging problem, especially using model-free approaches. Most techniques, such as nearest-neighbor prediction, quickly suffer from the curse of dimensionality and overfitting for more than a few predictors, which has limited their application mostly to the univariate case. Therefore, selection strategies are needed that harness the available information as efficiently as possible. Since often the right combination of predictors matters, ideally all subsets of possible predictors should be tested for their predictive power, but the exponentially growing number of combinations makes such an approach computationally prohibitive. Here a prediction scheme that overcomes this strong limitation is introduced, utilizing a causal preselection step which drastically reduces the number of possible predictors to the most predictive set of causal drivers, making a globally optimal search scheme tractable. The information-theoretic optimality is derived and practical selection criteria are discussed. As demonstrated for multivariate nonlinear stochastic delay processes, the optimal scheme can even be less computationally expensive than commonly used suboptimal schemes like forward selection. The method suggests a general framework to apply the optimal model-free approach to select variables and subsequently fit a model to further improve a prediction or learn statistical dependencies. The performance of this framework is illustrated on a climatological index of the El Niño Southern Oscillation.
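A minimal nearest-neighbour predictor over an (already preselected) predictor set looks like this (a sketch; the paper's contribution is the causal preselection step and its optimality theory, not this estimator itself):

```python
import numpy as np

def knn_forecast(X, y, x_query, k=3):
    """Model-free prediction: average the outcomes of the k past
    predictor vectors closest to the query point."""
    d = np.linalg.norm(X - x_query, axis=1)
    return y[np.argsort(d)[:k]].mean()

# toy training data: a hypothetical single preselected predictor
X = np.arange(10, dtype=float)[:, None]
y = 2.0 * np.arange(10)
pred1 = knn_forecast(X, y, np.array([4.2]), k=1)
pred3 = knn_forecast(X, y, np.array([4.2]), k=3)
```

The curse of dimensionality the abstract mentions shows up here directly: as columns are added to X, distances concentrate and the "nearest" neighbours stop being informative, which is what the causal preselection step counteracts.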
Mark-specific hazard ratio model with missing multivariate marks.
Juraska, Michal; Gilbert, Peter B
2016-10-01
An objective of randomized placebo-controlled preventive HIV vaccine efficacy (VE) trials is to assess the relationship between vaccine effects to prevent HIV acquisition and continuous genetic distances of the exposing HIVs to multiple HIV strains represented in the vaccine. The set of genetic distances, only observed in failures, is collectively termed the 'mark.' The objective has motivated a recent study of a multivariate mark-specific hazard ratio model in the competing risks failure time analysis framework. Marks of interest, however, are commonly subject to substantial missingness, largely due to rapid post-acquisition viral evolution. In this article, we investigate the mark-specific hazard ratio model with missing multivariate marks and develop two inferential procedures based on (i) inverse probability weighting (IPW) of the complete cases, and (ii) augmentation of the IPW estimating functions by leveraging auxiliary data predictive of the mark. Asymptotic properties and finite-sample performance of the inferential procedures are presented. This research also provides general inferential methods for semiparametric density ratio/biased sampling models with missing data. We apply the developed procedures to data from the HVTN 502 'Step' HIV VE trial.
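The first of the two procedures rests on the familiar inverse-probability-weighting idea; for a simple mean (not the mark-specific hazard model of the paper) it reads:

```python
import numpy as np

def ipw_mean(y, observed, pi):
    """IPW estimate of E[y]: complete cases weighted by 1 / pi, where pi
    is each case's probability of being observed."""
    w = observed / pi
    return np.sum(w * y) / np.sum(w)

# toy data: the second value is missing; pi values are invented
est = ipw_mean(y=np.array([1.0, 2.0, 3.0, 4.0]),
               observed=np.array([1, 0, 1, 1]),
               pi=np.array([0.5, 0.5, 1.0, 1.0]))
```

Up-weighting the observed case that had only a 50% chance of being observed compensates, on average, for its missing counterpart; the paper's second procedure augments such weighted estimating functions with auxiliary data to recover efficiency.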
Multivariate parametric random effect regression models for fecundability studies.
Ecochard, R; Clayton, D G
2000-12-01
Delay until conception is generally described by a mixture of geometric distributions. Weinberg and Gladen (1986, Biometrics 42, 547-560) proposed a regression generalization of the beta-geometric mixture model where covariate effects were expressed in terms of contrasts of marginal hazards. Scheike and Jensen (1997, Biometrics 53, 318-329) developed a frailty model for discrete event time data based on discrete-time analogues of Hougaard's results (1984, Biometrika 71, 75-83). This paper presents a generalization to a three-parameter family of distributions and an extension to the multivariate case. The model allows the introduction of explanatory variables, including time-dependent variables at the subject-specific level, together with a choice from a flexible family of random effect distributions. This makes it possible, in the context of medically assisted conception, to include data sources with multiple pregnancies (or attempts at pregnancy) per couple.
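The baseline beta-geometric mixture is easy to simulate (a sketch with arbitrary Beta parameters; the paper generalises this to a three-parameter family with covariates and multivariate random effects):

```python
import numpy as np

def simulate_beta_geometric(a, b, n, rng):
    """Cycles to conception when each couple's per-cycle fecundability p
    is drawn from Beta(a, b): the beta-geometric mixture."""
    p = rng.beta(a, b, size=n)
    return rng.geometric(p)

t = simulate_beta_geometric(3.0, 1.0, 50_000, np.random.default_rng(3))
# for Beta(3, 1), E[T] = E[1/p] = 1.5
```

Heterogeneity in p across couples is what makes the marginal waiting-time distribution heavier-tailed than any single geometric, the feature the frailty extensions are designed to model.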
Multivariate Logistic Model to estimate Effective Rainfall for an Event
Singh, S. K.; Patil, Sachin; Bárdossy, A.
2009-04-01
Multivariate logistic models are widely used in the biological, medical, and social sciences, but logistic models are seldom applied to hydrological problems. A logistic function behaves linearly in the mid-range and tends to be non-linear as it approaches the extremes; hence it is more flexible than a linear function and capable of dealing with skew-distributed variables. Logistic models therefore bear good potential to handle asymmetrically distributed hydrological variables of extreme occurrence. In this study, a logistic regression approach is implemented to derive a multivariate logistic function for effective rainfall; in the process, the runoff coefficient is assumed to be a Bernoulli-distributed dependent variable. A backward stepwise logistic regression procedure was performed to derive the logistic transfer function between the runoff coefficient and catchment as well as event variables (e.g., drainage density, soil moisture). The investigation was carried out using a database of 244 rainfall-runoff events from 42 mesoscale catchments located in south-west Germany. The performance of the derived logistic transfer function was compared with that of the SCS method for estimating effective rainfall.
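A bare-bones logistic fit by gradient ascent shows the link function at work (illustrative only, on invented separable data; the study uses backward stepwise selection over catchment and event variables):

```python
import numpy as np

def fit_logistic(X, y, iters=200, lr=0.5):
    """Logistic regression via plain gradient ascent on the log-likelihood.
    X includes an intercept column; y holds 0/1 responses."""
    w = np.zeros(X.shape[1])
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-X @ w))        # current fitted probabilities
        w += lr * X.T @ (y - p) / len(y)        # score-function update
    return w

X = np.array([[1.0, -2.0], [1.0, -1.0], [1.0, 1.0], [1.0, 2.0]])
y = np.array([0.0, 0.0, 1.0, 1.0])
w = fit_logistic(X, y)
p_hi = 1.0 / (1.0 + np.exp(-np.array([1.0, 2.0]) @ w))
p_lo = 1.0 / (1.0 + np.exp(-np.array([1.0, -2.0]) @ w))
```

The fitted curve is close to linear near p = 0.5 and saturates toward 0 and 1, which is the flexibility the abstract appeals to for skew-distributed hydrological variables.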
Probability boxes on totally preordered spaces for multivariate modelling
Troffaes, Matthias C M; 10.1016/j.ijar.2011.02.001
2011-01-01
A pair of lower and upper cumulative distribution functions, also called probability box or p-box, is among the most popular models used in imprecise probability theory. They arise naturally in expert elicitation, for instance in cases where bounds are specified on the quantiles of a random variable, or when quantiles are specified only at a finite number of points. Many practical and formal results concerning p-boxes already exist in the literature. In this paper, we provide new efficient tools to construct multivariate p-boxes and develop algorithms to draw inferences from them. For this purpose, we formalise and extend the theory of p-boxes using Walley's behavioural theory of imprecise probabilities, and heavily rely on its notion of natural extension and existing results about independence modeling. In particular, we allow p-boxes to be defined on arbitrary totally preordered spaces, hence thereby also admitting multivariate p-boxes via probability bounds over any collection of nested sets. We focus on t...
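In its simplest univariate form, a p-box is just a pair of CDF envelopes from which interval-valued probabilities follow (a sketch; the paper's contribution is the multivariate extension to totally preordered spaces via Walley's natural extension):

```python
import numpy as np

def pbox_interval_prob(x_grid, F_lower, F_upper, a, b):
    """Bounds on P(a < X <= b) implied by a p-box [F_lower, F_upper]
    tabulated on x_grid (linear interpolation between grid points)."""
    Fl_a, Fl_b = np.interp([a, b], x_grid, F_lower)
    Fu_a, Fu_b = np.interp([a, b], x_grid, F_upper)
    lo = max(0.0, Fl_b - Fu_a)
    hi = min(1.0, Fu_b - Fl_a)
    return lo, hi

# a hypothetical p-box: the uniform CDF on [0, 1] with +/- 0.1 slack
grid = np.linspace(0.0, 1.0, 11)
F_low = np.maximum(0.0, grid - 0.1)
F_up = np.minimum(1.0, grid + 0.1)
lo, hi = pbox_interval_prob(grid, F_low, F_up, 0.2, 0.5)
```

When the two envelopes coincide the bounds collapse to a single probability, recovering the precise-CDF case.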
Implementing Modifed Burg Algorithms in Multivariate Subset Autoregressive Modeling
Directory of Open Access Journals (Sweden)
A. Alexandre Trindade
2003-02-01
The large number of parameters in subset vector autoregressive models often leads one to procure fast, simple, and efficient alternatives or precursors to maximum likelihood estimation. We present the solution of the multivariate subset Yule-Walker equations as one such alternative. In recent work, Brockwell, Dahlhaus, and Trindade (2002) show that the Yule-Walker estimators can actually be obtained as a special case of a general recursive Burg-type algorithm. We illustrate the structure of this algorithm, and discuss its implementation in a high-level programming language. Applications of the algorithm in univariate and bivariate modeling are showcased in examples. Univariate and bivariate versions of the algorithm written in Fortran 90 are included in the appendix, and their use illustrated.
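In the AR(1) case the Yule-Walker equations collapse to a single ratio of autocovariances, which makes a handy sanity check (the article's algorithm handles the full multivariate subset case recursively):

```python
import numpy as np

def yule_walker_ar1(x):
    """Yule-Walker estimate of phi in x_t = phi * x_{t-1} + e_t:
    phi_hat = gamma(1) / gamma(0), the lag-1 over lag-0 autocovariance."""
    x = x - x.mean()
    g0 = np.dot(x, x) / len(x)
    g1 = np.dot(x[:-1], x[1:]) / len(x)
    return g1 / g0

# simulate an AR(1) series with known phi = 0.7 and recover it
rng = np.random.default_rng(4)
x = np.zeros(20_000)
for t in range(1, len(x)):
    x[t] = 0.7 * x[t - 1] + rng.standard_normal()
phi_hat = yule_walker_ar1(x)
```

The multivariate generalisation replaces these scalar autocovariances with autocovariance matrices, and the subset structure is what the Burg-type recursion exploits.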
MULTIVARIATE MODEL FOR CORPORATE BANKRUPTCY PREDICTION IN ROMANIA
Directory of Open Access Journals (Sweden)
Daniel BRÎNDESCU – OLARIU
2016-06-01
The current paper proposes a methodology for bankruptcy prediction applicable to Romanian companies. Low bankruptcy frequencies registered in the past have limited the importance of bankruptcy prediction in Romania. The changes in the economic environment brought by the economic crisis, as well as by entry into the European Union, make the availability of well-performing bankruptcy assessment tools more important than ever before. The proposed methodology is centred on a multivariate model, developed through discriminant analysis. Financial ratios are employed as explanatory variables within the model. The study has included 53,252 yearly financial statements from the period 2007-2010, with the state of the companies being monitored until the end of 2012. It thus employs the largest sample ever used in Romanian research in the field of bankruptcy prediction, targeting not high levels of accuracy over isolated samples, but reliability and ease of use over the entire population.
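The two-class core of discriminant analysis is Fisher's rule w = Sw^{-1}(mu1 - mu0); a toy sketch with made-up points (the study estimates such a function from financial ratios over 53,252 statements):

```python
import numpy as np

def fisher_discriminant(X0, X1):
    """Fisher's linear discriminant direction: solve Sw w = mu1 - mu0,
    with Sw the pooled within-class scatter matrix."""
    mu0, mu1 = X0.mean(axis=0), X1.mean(axis=0)
    Sw = np.cov(X0.T) * (len(X0) - 1) + np.cov(X1.T) * (len(X1) - 1)
    return np.linalg.solve(Sw, mu1 - mu0)

# hypothetical 2-ratio data: class 1 shifted along the first ratio only
X0 = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
X1 = X0 + np.array([10.0, 0.0])
w = fisher_discriminant(X0, X1)
```

The discriminant direction loads entirely on the ratio that separates the classes; companies are then scored by projecting their ratio vector onto w.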
A New Multivariate Markov Chain Model for Adding a New Categorical Data Sequence
2014-01-01
We propose a new multivariate Markov chain model for adding a new categorical data sequence. The number of parameters in the new multivariate Markov chain model is only 3s, compared with (s+1)^2 in the former multivariate Markov chain model. Numerical experiments demonstrate the benefits of the new multivariate Markov chain model in saving computational resources.
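The ingredient such models are built from is the empirical transition matrix of each categorical sequence:

```python
import numpy as np

def transition_matrix(seq, k):
    """Row-normalised count matrix P[i, j] = P(next = j | current = i)
    for a categorical sequence with states 0..k-1."""
    P = np.zeros((k, k))
    for a, b in zip(seq[:-1], seq[1:]):
        P[a, b] += 1.0
    rows = P.sum(axis=1, keepdims=True)
    return np.divide(P, rows, out=np.zeros_like(P), where=rows > 0)

P = transition_matrix([0, 1, 0, 1, 1], k=2)
```

A multivariate Markov chain model then combines such (cross-)transition matrices across the s sequences with non-negative weights, which is where the parameter counts in the abstract come from.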
Clustering Multivariate Time Series Using Hidden Markov Models
Directory of Open Access Journals (Sweden)
Shima Ghassempour
2014-03-01
In this paper we describe an algorithm for clustering multivariate time series with variables taking both categorical and continuous values. Time series of this type are frequent in health care, where they represent the health trajectories of individuals. The problem is challenging because categorical variables make it difficult to define a meaningful distance between trajectories. We propose an approach based on Hidden Markov Models (HMMs), where we first map each trajectory into an HMM, then define a suitable distance between HMMs and finally proceed to cluster the HMMs with a method based on a distance matrix. We test our approach on a simulated, but realistic, data set of 1,255 trajectories of individuals of age 45 and over, on a synthetic validation set with known clustering structure, and on a smaller set of 268 trajectories extracted from the longitudinal Health and Retirement Survey. The proposed method can be implemented quite simply using standard packages in R and Matlab and may be a good candidate for solving the difficult problem of clustering multivariate time series with categorical variables using tools that do not require advanced statistical knowledge, and therefore are accessible to a wide range of researchers.
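Once trajectory-to-trajectory distances exist (HMM-based in the paper), the final step is ordinary clustering on the distance matrix; a sketch with a hypothetical precomputed distance matrix standing in for the HMM-to-HMM distances:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

def cluster_from_distances(D, n_clusters):
    """Average-linkage hierarchical clustering of items given a symmetric
    distance matrix D with a zero diagonal."""
    Z = linkage(squareform(D), method="average")
    return fcluster(Z, t=n_clusters, criterion="maxclust")

# four items: {0, 1} close to each other, {2, 3} close, far across groups
D = np.array([[0.0, 0.1, 1.0, 1.0],
              [0.1, 0.0, 1.0, 1.0],
              [1.0, 1.0, 0.0, 0.1],
              [1.0, 1.0, 0.1, 0.0]])
labels = cluster_from_distances(D, 2)
```

Because the clustering consumes only the distance matrix, the HMM layer can be swapped for any other trajectory representation without touching this step.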
A Multivariate Downscaling Model for Nonparametric Simulation of Daily Flows
Molina, J. M.; Ramirez, J. A.; Raff, D. A.
2011-12-01
A multivariate, stochastic nonparametric framework for stepwise disaggregation of seasonal runoff volumes to daily streamflow is presented. The downscaling process is conditional on volumes of spring runoff and large-scale ocean-atmosphere teleconnections and includes a two-level cascade scheme: seasonal-to-monthly disaggregation first followed by monthly-to-daily disaggregation. The non-parametric and assumption-free character of the framework allows consideration of the random nature and nonlinearities of daily flows, which parametric models are unable to account for adequately. This paper examines statistical links between decadal/interannual climatic variations in the Pacific Ocean and hydrologic variability in the US northwest region, and includes a periodicity analysis of climate patterns to detect coherences of their cyclic behavior in the frequency domain. We explore the use of such relationships and selected signals (e.g., north Pacific gyre oscillation, southern oscillation, and Pacific decadal oscillation indices, NPGO, SOI and PDO, respectively) in the proposed data-driven framework by means of a combinatorial approach with the aim of simulating improved streamflow sequences when compared with disaggregated series generated from flows alone. A nearest neighbor time series bootstrapping approach is integrated with principal component analysis to resample from the empirical multivariate distribution. A volume-dependent scaling transformation is implemented to guarantee the summability condition. In addition, we present a new and simple algorithm, based on nonparametric resampling, that overcomes the common limitation of lack of preservation of historical correlation between daily flows across months. The downscaling framework presented here is parsimonious in parameters and model assumptions, does not generate negative values, and produces synthetic series that are statistically indistinguishable from the observations. We present evidence showing that both
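The nearest-neighbour resampling plus volume-dependent rescaling can be sketched as follows (hypothetical arrays; the full framework adds PCA, teleconnection conditioning, and the cross-month correlation fix):

```python
import numpy as np

def knn_disaggregate(hist_totals, hist_daily, target_total, rng, k=3):
    """Pick one of the k historical periods whose total is nearest the
    target, then rescale its daily pattern so the days sum exactly to the
    target total (the summability condition)."""
    nearest = np.argsort(np.abs(hist_totals - target_total))[:k]
    pattern = hist_daily[rng.choice(nearest)]
    return target_total * pattern / pattern.sum()

rng = np.random.default_rng(5)
hist_daily = rng.uniform(1.0, 5.0, size=(10, 30))   # 10 years x 30 days (toy)
hist_totals = hist_daily.sum(axis=1)
daily = knn_disaggregate(hist_totals, hist_daily, target_total=80.0, rng=rng)
```

Because the daily pattern is resampled from observed records and only rescaled, the scheme cannot generate negative flows, one of the properties the abstract highlights.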
Modeling the latent dimensions of multivariate signaling datasets
Jensen, Karin J.; Janes, Kevin A.
2012-08-01
Cellular signal transduction is coordinated by modifications of many proteins within cells. Protein modifications are not independent, because some are connected through shared signaling cascades and others jointly converge upon common cellular functions. This coupling creates a hidden structure within a signaling network that can point to higher level organizing principles of interest to systems biology. One can identify important covariations within large-scale datasets by using mathematical models that extract latent dimensions—the key structural elements of a measurement set. In this paper, we introduce two principal component-based methods for identifying and interpreting latent dimensions. Principal component analysis provides a starting point for unbiased inspection of the major sources of variation within a dataset. Partial least-squares regression reorients these dimensions toward a specific hypothesis of interest. Both approaches have been used widely in studies of cell signaling, and they should be standard analytical tools once highly multivariate datasets become straightforward to accumulate.
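Extracting latent dimensions with PCA reduces to a singular value decomposition of the mean-centred data matrix; a minimal sketch on a rank-one toy dataset (real signaling datasets would have many observations and measured modifications):

```python
import numpy as np

def principal_components(X, k):
    """Scores and loadings of the top-k principal components of X
    (columns are mean-centred before the SVD)."""
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    return U[:, :k] * s[:k], Vt[:k].T

# rank-one toy data: two 'protein modifications' driven by one latent signal
X = np.outer([0.0, 1.0, 2.0, 3.0], [1.0, 2.0])
scores, loadings = principal_components(X, k=1)
```

For this rank-one example a single component reconstructs the centred data exactly; partial least-squares differs in rotating such dimensions toward covariance with a chosen response.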
Bayesian model selection for constrained multivariate normal linear models
Mulder, J.
2010-01-01
The expectations that researchers have about the structure in the data can often be formulated in terms of equality constraints and/or inequality constraints on the parameters in the model that is used. In a (M)AN(C)OVA model, researchers have expectations about the differences between the
Sparse multivariate autoregressive modeling for mild cognitive impairment classification.
Li, Yang; Wee, Chong-Yaw; Jie, Biao; Peng, Ziwen; Shen, Dinggang
2014-07-01
Brain connectivity networks derived from functional magnetic resonance imaging (fMRI) are becoming increasingly prevalent in research related to cognitive and perceptual processes. The capability to detect causal or effective connectivity is highly desirable for understanding the cooperative nature of brain networks, particularly when the ultimate goal is to obtain good performance of control-patient classification with biologically meaningful interpretations. Understanding directed functional interactions between brain regions via brain connectivity networks is a challenging task. Since many genetic and biomedical networks are intrinsically sparse, incorporating the sparsity property into connectivity modeling can make the derived models more biologically plausible. Accordingly, we propose an effective connectivity modeling of resting-state fMRI data based on the multivariate autoregressive (MAR) modeling technique, which is widely used to characterize temporal information of dynamic systems. This MAR modeling technique allows for the identification of effective connectivity using the Granger causality concept and reduces spurious causal connectivity in the assessment of directed functional interaction from fMRI data. A forward orthogonal least squares (OLS) regression algorithm is further used to construct a sparse MAR model. By applying the proposed modeling to mild cognitive impairment (MCI) classification, we identify several most discriminative regions, including middle cingulate gyrus, posterior cingulate gyrus, lingual gyrus and caudate regions, in line with previous findings. A relatively high classification accuracy of 91.89% is also achieved, with an increment of 5.4% compared to the fully-connected, non-directional Pearson-correlation-based functional connectivity approach.
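A dense MAR(1) fit by least squares is the natural starting point before sparsification (a sketch on noiseless toy data; the paper adds forward OLS selection and the Granger-causality interpretation of the resulting coefficients):

```python
import numpy as np

def fit_mar1(X):
    """Least-squares estimate of A in x_t = A x_{t-1} + e_t from a
    (T x d) multivariate time series X."""
    A, *_ = np.linalg.lstsq(X[:-1], X[1:], rcond=None)
    return A.T

# toy 2-region system with a known, sparse coefficient matrix
A_true = np.array([[0.5, 0.1], [0.0, 0.3]])
X = np.empty((10, 2))
X[0] = [1.0, 1.0]
for t in range(1, 10):
    X[t] = A_true @ X[t - 1]      # noiseless, for an exact-recovery check
A_hat = fit_mar1(X)
```

The zero entry in A_true is the kind of absent directed influence that the forward OLS step is designed to detect and enforce in the estimated model.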
Production optimisation in the petrochemical industry by hierarchical multivariate modelling
Energy Technology Data Exchange (ETDEWEB)
Andersson, Magnus; Furusjoe, Erik; Jansson, Aasa
2004-06-01
This project demonstrates the advantages of applying hierarchical multivariate modelling in the petrochemical industry in order to increase knowledge of the total process. The models indicate possible ways to optimise the process regarding the use of energy and raw material, which is directly linked to the environmental impact of the process. The refinery of Nynaes Refining AB (Goeteborg, Sweden) has acted as a demonstration site in this project. The models developed for the demonstration site resulted in: Detection of an unknown process disturbance and suggestions of possible causes; Indications on how to increase the yield in combination with energy savings; The possibility to predict product quality from on-line process measurements, making the results available at a higher frequency than customary laboratory analysis; Quantification of the gradually lowered efficiency of heat transfer in the furnace and increased fuel consumption as an effect of soot build-up on the furnace coils; Increased knowledge of the relation between production rate and the efficiency of the heat exchangers. This report is one of two reports from the project. It contains a technical discussion of the result with some degree of detail. A shorter and more easily accessible report is also available, see IVL report B1586-A.
Introduction to multivariate analysis linear and nonlinear modeling
Konishi, Sadanori
2014-01-01
"The presentation is always clear and several examples and figures facilitate an easy understanding of all the techniques. The book can be used as a textbook in advanced undergraduate courses in multivariate analysis, and can represent a valuable reference manual for biologists and engineers working with multivariate datasets." - Fabio Rapallo, Zentralblatt MATH 1296
Optimisation of Marine Boilers using Model-based Multivariable Control
DEFF Research Database (Denmark)
Solberg, Brian
Traditionally, marine boilers have been controlled using classical single loop controllers. To optimise marine boiler performance, reduce new installation time and minimise the physical dimensions of these large steel constructions, a more comprehensive and coherent control strategy is needed. This research deals with the application of advanced control to a specific class of marine boilers, combining well-known design methods for multivariable systems. The thesis presents contributions to modelling and control of one-pass smoke tube marine boilers as well as to hybrid systems control. Much of the focus has been directed towards water level control, which is complicated by the nature of the disturbances acting on the system as well as by low frequency sensor noise. This focus was motivated by an estimated large potential to minimise the boiler geometry by reducing water level fluctuations. In the thesis the pressure control is based on a new method when on/off burner switching is required, while the water level control is handled by a model predictive controller.
Empirical likelihood ratio tests for multivariate regression models
Institute of Scientific and Technical Information of China (English)
WU Jianhong; ZHU Lixing
2007-01-01
This paper proposes some diagnostic tools for checking the adequacy of multivariate regression models, including classical regression and time series autoregression. In statistical inference, the empirical likelihood ratio method is well known to be a powerful tool for constructing tests and confidence regions. For model checking, however, the naive empirical likelihood (EL) based tests do not exhibit the Wilks phenomenon. Hence, we make use of bias correction to construct EL-based score tests and derive a nonparametric version of Wilks' theorem. Moreover, by combining the advantages of both the EL and the score test method, the EL-based score tests share many desirable features: they are self-scale invariant and can detect alternatives that converge to the null at rate n^(-1/2), the fastest possible rate for lack-of-fit testing; and they involve weight functions, which provide the flexibility to choose scores for improving power performance, especially under directional alternatives. Furthermore, when the alternatives are not directional, we construct asymptotically distribution-free maximin tests for a large class of possible alternatives. A simulation study is carried out and an application to a real dataset is analyzed.
Multivariate models of inter-subject anatomical variability.
Ashburner, John; Klöppel, Stefan
2011-05-15
This paper presents a very selective review of some of the approaches for multivariate modelling of inter-subject variability among brain images. It focusses on applying probabilistic kernel-based pattern recognition approaches to pre-processed anatomical MRI, with the aim of most accurately modelling the difference between populations of subjects. Some of the principles underlying the pattern recognition approaches of Gaussian process classification and regression are briefly described, although the reader is advised to look elsewhere for full implementational details. Kernel pattern recognition methods require matrices that encode the degree of similarity between the images of each pair of subjects. This review focusses on similarity measures derived from the relative shapes of the subjects' brains. Pre-processing is viewed as generative modelling of anatomical variability, and there is a special emphasis on the diffeomorphic image registration framework, which provides a very parsimonious representation of relative shapes. Although the review is largely methodological, excessive mathematical notation is avoided as far as possible, as the paper attempts to convey a more intuitive understanding of various concepts. The paper should be of interest to readers wishing to apply pattern recognition methods to MRI data, with the aim of clinical diagnosis or biomarker development. It also tries to explain that the best models are those that most accurately predict, so similar approaches should also be relevant to basic science. Knowledge of some basic linear algebra and probability theory should make the review easier to follow, although it may still have something to offer to those readers whose mathematics may be more limited. Copyright © 2010 Elsevier Inc. All rights reserved.
Multivariable parametric cost model for space and ground telescopes
Stahl, H. Philip; Henrichs, Todd
2016-09-01
Parametric cost models can be used by designers and project managers to perform relative cost comparisons between major architectural cost drivers and allow high-level design trades; enable cost-benefit analysis for technology development investment; and provide a basis for estimating total project cost between related concepts. This paper hypothesizes a single model, based on published models and engineering intuition, for both ground and space telescopes: OTA Cost ∝ D^(1.75 ± 0.05) λ^(-0.5 ± 0.25) T^(-0.25) e^(-0.04 Y). Specific findings include: space telescopes cost 50X to 100X more than ground telescopes; diameter is the most important cost estimating relationship (CER); cost is reduced by approximately 50% every 20 years (presumably because of technology advances and process improvements); and, for space telescopes, cost associated with wavelength performance is balanced by cost associated with operating temperature. Finally, duplication only reduces cost for the manufacture of identical systems (i.e. multiple-aperture sparse arrays or interferometers). And, while duplication does reduce the cost of manufacturing the mirrors of a segmented primary mirror, this cost savings does not appear to manifest itself in the final primary mirror assembly (presumably because the structure for a segmented mirror is more complicated than for a monolithic mirror).
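As a rough worked example, the CER can be used to compare two telescope concepts. This is a sketch only: the central exponent values are used, the uncertainty bands are ignored, and the proportionality constant cancels because only cost ratios are computed.

```python
import math

def relative_ota_cost(D, lam, T, year, D0, lam0, T0, year0):
    """Relative OTA cost from the parametric CER:
    cost ∝ D^1.75 * λ^-0.5 * T^-0.25 * exp(-0.04 * Y).
    Returns cost(concept) / cost(baseline) using central exponents only."""
    return ((D / D0) ** 1.75 * (lam / lam0) ** -0.5
            * (T / T0) ** -0.25 * math.exp(-0.04 * (year - year0)))

# Doubling the aperture alone roughly triples cost: 2**1.75
r = relative_ota_cost(2.0, 1.0, 1.0, 0, 1.0, 1.0, 1.0, 0)
# 20 years of technology advance: exp(-0.04 * 20), roughly a 50% reduction
h = relative_ota_cost(1.0, 1.0, 1.0, 20, 1.0, 1.0, 1.0, 0)
```

The second result illustrates the paper's finding that cost falls by approximately 50% every 20 years.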
Xuan Chi; Barry Goodwin
2012-01-01
Spatial and temporal relationships among agricultural prices have been an important topic of applied research for many years. Such research is used to investigate the performance of markets and to examine linkages up and down the marketing chain. This research has empirically evaluated price linkages by using correlation and regression models and, later, linear and...
BayesDccGarch - An Implementation of Multivariate GARCH DCC Models
Fioruci, Jose A.; Ehlers, Ricardo S.; Louzada, Francisco
2014-01-01
Multivariate GARCH models are important tools to describe the dynamics of multivariate time series of financial returns. Nevertheless, these models have been much less used in practice due to the lack of reliable software. This paper describes the R package BayesDccGarch, which was developed to implement recently proposed inference procedures to estimate and compare multivariate GARCH models allowing for asymmetric and heavy-tailed distributions.
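The package itself is in R, but the DCC correlation recursion at the heart of such models is compact enough to sketch directly. The following snippet (numpy only, illustrative parameter values, assuming residuals already standardized by univariate GARCH volatilities) shows how dynamic correlations are built:

```python
import numpy as np

def dcc_correlations(eps, a=0.05, b=0.90):
    """DCC(1,1) correlation recursion, a minimal sketch.

    eps: (T, N) standardized residuals. Builds
    Q_t = (1 - a - b) * S + a * eps_{t-1} eps_{t-1}' + b * Q_{t-1}
    and rescales each Q_t to a correlation matrix R_t. Returns (T, N, N).
    """
    T, N = eps.shape
    S = np.corrcoef(eps.T)            # unconditional correlation target
    Q = S.copy()
    R = np.empty((T, N, N))
    for t in range(T):
        d = 1.0 / np.sqrt(np.diag(Q))
        R[t] = Q * np.outer(d, d)     # rescale Q_t to unit diagonal
        Q = (1 - a - b) * S + a * np.outer(eps[t], eps[t]) + b * Q
    return R

rng = np.random.default_rng(1)
eps = rng.standard_normal((200, 2))
R = dcc_correlations(eps)
```

Because each update of Q is a positive-semidefinite combination, every R_t is a valid correlation matrix with unit diagonal.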
Smooth-Threshold Multivariate Genetic Prediction with Unbiased Model Selection.
Ueki, Masao; Tamiya, Gen
2016-04-01
We develop a new genetic prediction method, smooth-threshold multivariate genetic prediction, using single nucleotide polymorphism (SNP) data in genome-wide association studies (GWASs). Our method consists of two stages. At the first stage, unlike the usual discontinuous SNP screening used in the gene score method, our method continuously screens SNPs based on the output from standard univariate analysis of the marginal association of each SNP. At the second stage, the predictive model is built by generalized ridge regression simultaneously using the screened SNPs, with SNP weights determined by the strength of marginal association. Continuous SNP screening by smooth thresholding not only makes prediction stable but also leads to a closed-form expression for the generalized degrees of freedom (GDF). The GDF leads to Stein's unbiased risk estimate (SURE), which enables a data-dependent choice of the optimal SNP screening cutoff without using cross-validation. Our method is very rapid because a computationally expensive genome-wide scan is required only once, in contrast to penalized regression methods including the lasso and elastic net. Simulation studies that mimic real GWAS data with quantitative and binary traits demonstrate that the proposed method outperforms the gene score method and genomic best linear unbiased prediction (GBLUP), and shows comparable or sometimes better performance than the lasso and elastic net, which are known to have good predictive ability but heavy computational cost. Application to whole-genome sequencing (WGS) data from the Alzheimer's Disease Neuroimaging Initiative (ADNI) shows that the proposed method has higher predictive power than the gene score and GBLUP methods.
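The two-stage idea, continuous screening followed by weighted ridge regression, can be illustrated with a toy sketch. Everything below is an assumption-laden simplification (numpy only, a simple quadratic smooth-threshold weight, fixed tuning constants, simulated genotypes); the paper's GDF/SURE-based choice of the screening cutoff is not reproduced:

```python
import numpy as np

def smooth_threshold_ridge(G, y, tau=2.0, lam=1.0):
    """Sketch of two-stage smooth-threshold genetic prediction.

    Stage 1: marginal association z-score for each SNP column.
    Stage 2: SNPs are weighted continuously by a smooth threshold on |z|
    (rather than a hard include/exclude cut) and used in ridge regression.
    G: (n, p) genotype matrix, y: (n,) trait. Returns effective coefficients.
    """
    n, p = G.shape
    Gc = G - G.mean(0)
    yc = y - y.mean()
    # marginal z-scores (correlation scaled by sqrt(n))
    z = (Gc.T @ yc) / (np.sqrt((Gc ** 2).sum(0)) * yc.std() + 1e-12)
    # smooth-threshold weight: 0 below tau, rising continuously toward 1
    w = np.maximum(0.0, 1.0 - (tau / (np.abs(z) + 1e-12)) ** 2)
    Gw = Gc * w                                  # continuously screened design
    beta = np.linalg.solve(Gw.T @ Gw + lam * np.eye(p), Gw.T @ yc)
    return w * beta                              # coefficients on original scale

rng = np.random.default_rng(2)
n, p = 300, 50
G = rng.integers(0, 3, size=(n, p)).astype(float)   # genotypes coded 0/1/2
y = 1.5 * G[:, 0] - 1.0 * G[:, 1] + rng.standard_normal(n)
coef = smooth_threshold_ridge(G, y)
```

The two causal SNPs receive weights near one and coefficients near their true effects, while most null SNPs are smoothly shrunk to zero.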
Extensions to Multivariate Space Time Mixture Modeling of Small Area Cancer Data
Directory of Open Access Journals (Sweden)
Rachel Carroll
2017-05-01
Oral cavity and pharynx cancer, even when considered together, is a fairly rare disease. Implementation of multivariate modeling with lung and bronchus cancer, as well as melanoma cancer of the skin, could lead to better inference for oral cavity and pharynx cancer. The multivariate structure of these models is accomplished via the use of shared random effects, as well as other multivariate prior distributions. The results in this paper indicate that care should be taken when executing these types of models, and that multivariate mixture models may not always be the ideal option, depending on the data of interest.
Multivariate Hawkes process models of the occurrence of regulatory elements
DEFF Research Database (Denmark)
Carstensen, L; Sandelin, A; Winther, Ole
2010-01-01
BACKGROUND: A central question in molecular biology is how transcriptional regulatory elements (TREs) act in combination. Recent high-throughput data provide us with the location of multiple regulatory regions for multiple regulators, and thus with the possibility of analyzing the multivariate occurrences of multiple TREs along the genome. The multivariate Hawkes process model presented here is capable of providing new insights into dependencies among elements involved in transcriptional regulation. The method is available as an R package from http://www.math.ku.dk/~richard/ppstat/.
Multivariate models to classify Tuscan virgin olive oils by zone.
Directory of Open Access Journals (Sweden)
Alessandri, Stefano
1999-10-01
In order to study and classify Tuscan virgin olive oils, 179 samples were collected. They were obtained from drupes harvested during the first half of November, from three different zones of the Region. The sampling was repeated for 5 years. Fatty acids, phytol, aliphatic and triterpenic alcohols, triterpenic dialcohols, sterols, squalene and tocopherols were analyzed. A subset of variables was considered. They were selected in a preceding work as the most effective and reliable from the univariate point of view. The analytical data were transformed (except for cycloartenol) to compensate for annual variations; the mean related to the East zone was subtracted from each value within each year. Univariate three-class models were calculated and further variables discarded. Then multivariate three-zone models were evaluated, including phytol (which was always selected) and all the combinations of palmitic, palmitoleic and oleic acid, tetracosanol, cycloartenol and squalene. Models including from two to seven variables were studied. The best model shows by-zone classification errors of less than 40%, by-zone within-year classification errors of less than 45%, and a global classification error of 30%. This model includes phytol, palmitic acid, tetracosanol and cycloartenol.
Ecological prediction with nonlinear multivariate time-frequency functional data models
Yang, Wen-Hsi; Wikle, Christopher K.; Holan, Scott H.; Wildhaber, Mark L.
2013-01-01
Time-frequency analysis has become a fundamental component of many scientific inquiries. Due to improvements in technology, the amount of high-frequency signals that are collected for ecological and other scientific processes is increasing at a dramatic rate. In order to facilitate the use of these data in ecological prediction, we introduce a class of nonlinear multivariate time-frequency functional models that can identify important features of each signal as well as the interaction of signals corresponding to the response variable of interest. Our methodology is of independent interest and utilizes stochastic search variable selection to improve model selection and performs model averaging to enhance prediction. We illustrate the effectiveness of our approach through simulation and by application to predicting spawning success of shovelnose sturgeon in the Lower Missouri River.
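The time-frequency features that such functional models operate on can be extracted with a simple windowed FFT. The sketch below (numpy only, a naive Hann-windowed spectrogram, synthetic signal) illustrates how a signal is turned into features indexed by both time and frequency, from which a downstream model can select:

```python
import numpy as np

def spectrogram(x, win=64, hop=32):
    """Naive magnitude spectrogram: Hann-windowed frames + real FFT.

    Returns an (n_frames, win//2 + 1) array of magnitudes; each row is one
    time frame, each column one frequency bin.
    """
    w = np.hanning(win)
    frames = [x[i:i + win] * w for i in range(0, len(x) - win + 1, hop)]
    return np.abs(np.fft.rfft(np.array(frames), axis=1))

fs = 256
t = np.arange(fs) / fs
# a signal whose dominant frequency changes halfway through: 10 Hz then 40 Hz
x = np.concatenate([np.sin(2 * np.pi * 10 * t), np.sin(2 * np.pi * 40 * t)])
S = spectrogram(x)
peak_bins = S.argmax(axis=1)          # dominant frequency bin per frame
freqs = peak_bins * fs / 64.0         # bin index -> Hz for win=64
```

The per-frame peak frequency tracks the change in the signal: early frames peak near 10 Hz, late frames at 40 Hz.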
Multivariate model of female black bear habitat use for a Geographic Information System
Clark, Joseph D.; Dunn, James E.; Smith, Kimberly G.
1993-01-01
Simple univariate statistical techniques may not adequately assess the multidimensional nature of habitats used by wildlife. Thus, we developed a multivariate method to model habitat-use potential using a set of female black bear (Ursus americanus) radio locations and habitat data consisting of forest cover type, elevation, slope, aspect, distance to roads, distance to streams, and forest cover type diversity score in the Ozark Mountains of Arkansas. The model is based on the Mahalanobis distance statistic coupled with Geographic Information System (GIS) technology. That statistic is a measure of dissimilarity and represents a standardized squared distance between a set of sample variates and an ideal based on the mean of variates associated with animal observations. Calculations were made with the GIS to produce a map containing Mahalanobis distance values within each cell on a 60- × 60-m grid. The model identified areas of high habitat use potential that could not otherwise be identified by independent perusal of any single map layer. This technique avoids many pitfalls that commonly affect typical multivariate analyses of habitat use and is a useful tool for habitat manipulation or mitigation to favor terrestrial vertebrates that use habitats on a landscape scale.
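The core computation, the squared Mahalanobis distance of each grid cell's habitat variables from the mean of the animal-use locations, is easy to sketch. The snippet below (numpy only, two made-up habitat variables standing in for the seven used in the paper, no GIS layer handling) shows the idea:

```python
import numpy as np

def mahalanobis_map(grid_vars, use_points):
    """Habitat-use potential as squared Mahalanobis distance.

    grid_vars: (n_cells, k) habitat variables per grid cell.
    use_points: (n_obs, k) habitat variables at animal locations.
    Returns (n_cells,) squared distances; small values indicate habitat
    similar to that at observed animal locations.
    """
    mu = use_points.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(use_points, rowvar=False))
    d = grid_vars - mu
    # quadratic form d_i' * cov_inv * d_i for every cell i
    return np.einsum('ij,jk,ik->i', d, cov_inv, d)

rng = np.random.default_rng(3)
# hypothetical use locations: elevation ~500 m, slope ~10 degrees
use = rng.multivariate_normal([500.0, 10.0],
                              [[100.0, 0.0], [0.0, 4.0]], size=200)
cells = np.array([[500.0, 10.0],    # cell matching the used-habitat mean
                  [800.0, 30.0]])   # cell far from it
d2 = mahalanobis_map(cells, use)
```

In a GIS application this distance would be evaluated for every cell of the 60 x 60 m grid and rendered as a map of habitat-use potential.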
Haberman, Shelby J.; von Davier, Matthias; Lee, Yi-Hsuan
2008-01-01
Multidimensional item response models can be based on multivariate normal ability distributions or on multivariate polytomous ability distributions. For the case of simple structure in which each item corresponds to a unique dimension of the ability vector, some applications of the two-parameter logistic model to empirical data are employed to…
Measuring inflation persistence in Brazil using a multivariate model
Directory of Open Access Journals (Sweden)
Vicente da Gama Machado
2014-06-01
We estimate inflation persistence in Brazil in a multivariate framework of unobserved components, accounting for the following sources affecting inflation persistence: deviations of expectations from the actual policy target; persistence of the factors driving inflation; and the usual intrinsic measure of persistence, evaluated through lagged inflation terms. Data on inflation, output and interest rates are decomposed into unobserved components. To simplify the estimation of the large number of unknown variables, we employ Bayesian analysis. Our results indicate that expectations-based persistence matters considerably for inflation persistence in Brazil.
Cheng, Jun-Hu; Nicolai, Bart; Sun, Da-Wen
2017-01-01
Muscle foods are very important for a well-balanced daily diet. Due to their perishability and vulnerability, there is a need for quality and safety evaluation of such foods. Hyperspectral imaging (HSI) coupled with multivariate analysis is becoming increasingly popular for the non-destructive, non-invasive, and rapid determination of important quality attributes and the classification of muscle foods. This paper reviews recent advances of application of HSI for predicting some significant muscle foods parameters, including color, tenderness, firmness, springiness, water-holding capacity, drip loss and pH. In addition, algorithms for the rapid classification of muscle foods are also reported and discussed. It will be shown that this technology has great potential to replace traditional analytical methods for predicting various quality parameters and classifying muscle foods. Copyright © 2016 Elsevier Ltd. All rights reserved.
Multi-Variable Model-Based Parameter Estimation Model for Antenna Radiation Pattern Prediction
Deshpande, Manohar D.; Cravey, Robin L.
2002-01-01
A new procedure is presented to develop a multi-variable model-based parameter estimation (MBPE) model to predict the far-field intensity of an antenna. By performing the MBPE model development procedure on one variable at a time, the present method requires the solution of smaller matrices. The utility of the present method is demonstrated by determining the far-field intensity due to a dipole antenna over a frequency range of 100-1000 MHz and an elevation angle range of 0-90 degrees.
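MBPE models a response as a rational function, whose poles capture resonant behavior more economically than a polynomial. The sketch below (numpy only, a single variable, a made-up rational response standing in for a field quantity) fits such a model by the standard linearized least-squares trick, mirroring the paper's one-variable-at-a-time strategy:

```python
import numpy as np

def fit_rational(x, y, n_num=2, n_den=2):
    """Fit y ≈ N(x)/D(x), with polynomial N (degree n_num) and D (degree
    n_den, constant term fixed to 1), by linearizing to y*D(x) - N(x) ≈ 0
    and solving the resulting least-squares problem.
    Returns numerator and denominator coefficients in increasing order.
    """
    A = np.hstack([
        np.vander(x, n_num + 1, increasing=True),                     # N coeffs
        -(y[:, None]) * np.vander(x, n_den + 1, increasing=True)[:, 1:],  # D coeffs
    ])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    num = coef[:n_num + 1]
    den = np.concatenate([[1.0], coef[n_num + 1:]])
    return num, den

x = np.linspace(0.1, 2.0, 40)                # e.g. a normalized frequency sweep
y = (1.0 + 2.0 * x) / (1.0 + 0.5 * x ** 2)   # hypothetical rational response
num, den = fit_rational(x, y)
y_hat = np.polyval(num[::-1], x) / np.polyval(den[::-1], x)
```

Because the test response is itself rational of matching degree, the fit reproduces it essentially exactly; with measured data the same machinery gives a compact interpolating model.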
Multivariate zero-inflated modeling with latent predictors: Modeling feedback behavior
Fox, Gerardus J.A.
2013-01-01
In educational studies, the use of computer-based assessments leads to the collection of multiple outcomes to assess student performance. The student-specific outcomes are correlated and often measured in different scales, such as continuous and count outcomes. A multivariate zero-inflated model
Alessi, Lucia; Barigozzi, Matteo; Capasso, Marco
2006-01-01
We propose a new model for multivariate forecasting which combines the Generalized Dynamic Factor Model (GDFM) and the GARCH model. The GDFM, applied to a huge number of series, captures the multivariate information and disentangles the common and the idiosyncratic part of each series of returns. In this financial analysis, both these components are modeled as GARCH processes. We compare GDFM+GARCH and standard GARCH performance on samples up to 475 series, predicting both levels and volatility of returns...
Computer Program for Estimation Multivariate Volatility Processes Using DVEC Model of CRM
Directory of Open Access Journals (Sweden)
Jelena Z. Minović
2008-12-01
This article presents a computer program for the estimation of multivariate (bivariate and trivariate) volatility processes, written in EViews Version 4.1. In order to estimate multivariate volatility processes for analysis of the Serbian financial market, new subprograms had to be written within the EViews software package. The programs are written for the diagonal vector ARCH model (DVEC) in bivariate and trivariate versions.
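The DVEC recursion that such programs estimate is simple to state. As a minimal sketch (numpy rather than EViews, with parameter matrices simply given instead of estimated), the bivariate conditional covariance sequence can be generated as:

```python
import numpy as np

def dvec_covariances(eps, C, A, B):
    """Diagonal VEC (DVEC) covariance recursion:
    H_t = C + A ∘ (eps_{t-1} eps_{t-1}') + B ∘ H_{t-1},
    where ∘ is the elementwise (Hadamard) product and C, A, B are
    symmetric parameter matrices.
    eps: (T, N) return innovations. Returns (T, N, N) covariances.
    """
    T, N = eps.shape
    H = np.empty((T, N, N))
    H[0] = np.cov(eps.T)                      # initialize at the sample covariance
    for t in range(1, T):
        H[t] = C + A * np.outer(eps[t - 1], eps[t - 1]) + B * H[t - 1]
    return H

rng = np.random.default_rng(4)
eps = rng.standard_normal((300, 2))
C = np.array([[0.05, 0.01], [0.01, 0.05]])    # illustrative parameter values
A = np.full((2, 2), 0.10)
B = np.full((2, 2), 0.85)
H = dvec_covariances(eps, C, A, B)
```

Since each entry of H_t follows its own univariate GARCH-like recursion, symmetry of C, A, B and of the initial value is preserved at every step.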
Modelling world gold prices and USD foreign exchange relationship using multivariate GARCH model
Ping, Pung Yean; Ahmad, Maizah Hura Binti
2014-12-01
The world gold price is a popular investment commodity, and the series has often been modeled using univariate models. The objective of this paper is to show that there is a co-movement between the gold price and the USD foreign exchange rate. Using the effect of the USD foreign exchange rate on the gold price, a model that can be used to forecast future gold prices is developed. For this purpose, the current paper proposes a multivariate GARCH (bivariate GARCH) model. Using daily prices of both series from 01.01.2000 to 05.05.2014, a causal relation between the two series under study is found and a bivariate GARCH model is produced.
Modelling and Multi-Variable Control of Refrigeration Systems
DEFF Research Database (Denmark)
Larsen, Lars Finn Slot; Holm, J. R.
2003-01-01
In this paper a dynamic model of a 1:1 refrigeration system is presented. The main modelling effort has been concentrated on a lumped parameter model of a shell and tube condenser. The model has shown good resemblance with experimental data from a test rig, regarding the static as well as the dynamic behaviour.
Wu, Jiao; Liu, Fang; Jiao, L C; Wang, Xiaodong; Hou, Biao
2011-12-01
Most wavelet-based reconstruction methods of compressive sensing (CS) are developed under the independence assumption of the wavelet coefficients. However, the wavelet coefficients of images have significant statistical dependencies. Many multivariate prior models for the wavelet coefficients of images have been proposed and successfully applied to image estimation problems. In this paper, the statistical structures of the wavelet coefficients are considered for CS reconstruction of images that are sparse or compressible in the wavelet domain. A multivariate pursuit algorithm (MPA) based on the multivariate models is developed. Several multivariate scale mixture models are used as the prior distributions of MPA. Our method reconstructs the images by modeling the statistical dependencies of the wavelet coefficients in a neighborhood. The proposed algorithm based on these scale mixture models provides superior performance compared with many state-of-the-art compressive sensing reconstruction algorithms.
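The independence-prior baseline that the paper improves upon corresponds to plain iterative soft-thresholding. The sketch below (numpy only, a synthetic sparse signal in place of wavelet coefficients of an image, no multivariate scale-mixture prior) shows that baseline recovering a sparse vector from few random measurements:

```python
import numpy as np

def ista(Phi, y, lam=0.05, n_iter=500):
    """Iterative soft-thresholding (ISTA) for sparse recovery:
    min_x 0.5*||y - Phi x||^2 + lam*||x||_1.
    This is the independent (univariate) coefficient prior; multivariate
    scale-mixture priors replace the elementwise threshold with one that
    couples neighboring coefficients.
    """
    L = np.linalg.norm(Phi, 2) ** 2            # Lipschitz constant of gradient
    x = np.zeros(Phi.shape[1])
    for _ in range(n_iter):
        g = x + Phi.T @ (y - Phi @ x) / L      # gradient step
        x = np.sign(g) * np.maximum(np.abs(g) - lam / L, 0.0)  # soft threshold
    return x

rng = np.random.default_rng(5)
n, m = 50, 128                                 # 50 measurements, 128 unknowns
x_true = np.zeros(m)
x_true[[3, 40, 77]] = [1.0, -2.0, 1.5]         # 3-sparse ground truth
Phi = rng.standard_normal((n, m)) / np.sqrt(n) # random sensing matrix
y = Phi @ x_true                               # noiseless measurements
x_hat = ista(Phi, y)
```

With noiseless measurements and a well-conditioned random matrix, the support and amplitudes are recovered up to the usual small lasso shrinkage bias.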
Estimation of a multivariate mean under model selection uncertainty
Directory of Open Access Journals (Sweden)
Georges Nguefack-Tsague
2014-05-01
Model selection uncertainty occurs when we select a model based on one data set and subsequently apply it for statistical inference, because the "correct" model is not selected with certainty. When the selection and inference are based on the same data set, additional problems arise due to the correlation of the two stages (selection and inference). In this paper model selection uncertainty is considered and model averaging is proposed. The proposal is related to the theory of James and Stein for estimating more than three parameters from independent normal observations. We suggest that a model averaging scheme taking the selection procedure into account could be more appropriate than model selection alone. Some properties of this model averaging estimator are investigated; in particular, we show using Stein's results that it is a minimax estimator and can outperform Stein-type estimators.
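The James-Stein result the proposal builds on is easy to demonstrate numerically. The sketch below (unit-variance normal observations, positive-part shrinkage toward zero) shows the shrinkage estimator beating the raw observation in total squared error, the phenomenon that motivates averaging over models rather than committing to one:

```python
import numpy as np

def james_stein(x):
    """Positive-part James-Stein estimator of a multivariate normal mean
    (unit variance, p >= 3): shrinks the observation toward zero."""
    p = x.size
    shrink = max(0.0, 1.0 - (p - 2) / np.dot(x, x))
    return shrink * x

rng = np.random.default_rng(6)
p, reps = 10, 2000
theta = np.full(p, 0.5)                    # true mean vector
mse_raw = mse_js = 0.0
for _ in range(reps):
    x = theta + rng.standard_normal(p)     # one observation of the mean
    mse_raw += np.sum((x - theta) ** 2)
    mse_js += np.sum((james_stein(x) - theta) ** 2)
mse_raw /= reps
mse_js /= reps
```

The raw estimator's risk is p (here 10), while the James-Stein estimator's risk is strictly smaller for every true mean, which is the dominance property invoked in the abstract.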
An Asymmetric Block Dynamic Conditional Correlation Multivariate GARCH Model
Vargas, Gregorio A.
2006-01-01
The Block DCC model for determining dynamic correlations within and between groups of financial asset returns is extended to account for asymmetric effects. Simulation results show that the Asymmetric Block DCC model is competitive in in-sample forecasting and performs better than alternative DCC models in out-of-sample forecasting of conditional correlation in the presence of asymmetric effect between blocks of asset returns. Empirical results demonstrate that the model is able to capture ...
National Research Council Canada - National Science Library
Wided Ben Moussa
2014-01-01
This paper uses a multivariate GARCH modelling to describe the relationship between the systemic risk and the stock return in the banking industry in Thailand, Malaysia, Korea, Indonesia and Philippines...
Xu, Cheng-Jian; van der Schaaf, Arjen; Schilstra, Cornelis; Langendijk, Johannes A.; van t Veld, Aart A.
2012-01-01
PURPOSE: To study the impact of different statistical learning methods on the prediction performance of multivariate normal tissue complication probability (NTCP) models. METHODS AND MATERIALS: In this study, three learning methods, stepwise selection, least absolute shrinkage and selection operator
Landslide-exposed areas modeling using the multivariate analysis
2004-01-01
Landslide occurrence is governed by numerous spatial and temporal factors that can be divided into causes and triggers. Human interaction with the environment coincides mainly with the triggers, which are also of natural origin. For a better understanding of the causing factors, which mainly influence the spatial distribution, several methods based on GIS technology are used. Results derived from these methods define areas that are more exposed to triggering factors, consequentially...
Preference learning with evolutionary Multivariate Adaptive Regression Spline model
DEFF Research Database (Denmark)
Abou-Zleikha, Mohamed; Shaker, Noor; Christensen, Mads Græsbøll
2015-01-01
... for human decision making. Learning models from pairwise preference data is, however, an NP-hard problem. Therefore, constructing models that can effectively learn such data is a challenging task. Models are usually constructed with accuracy being the most important factor. Another vitally important aspect that is usually given less attention is expressiveness, i.e. how easy it is to explain the relationship between the model input and output. Most machine learning techniques are focused either on performance or on expressiveness. This paper employs MARS models, which have the advantage of being a powerful method...
Admissibilities of linear estimator in a class of linear models with a multivariate t error variable
Institute of Scientific and Technical Information of China (English)
Anonymous
2010-01-01
This paper discusses admissibilities of estimators in a class of linear models, which includes the following common models: the univariate and multivariate linear models, the growth curve model, the extended growth curve model, the seemingly unrelated regression equations, the variance components model, and so on. It is proved that admissible estimators of functions of the regression coefficient β in the class of linear models with multivariate t error terms, called Model II, are also admissible in the case that the error terms have a multivariate normal distribution, under a strictly convex loss function or a matrix loss function. It is also proved under Model II that the usual estimators of β are admissible for p ≤ 2 with a quadratic loss function, and are admissible for any p with a matrix loss function, where p is the dimension of β.
Evaluation of multivariate calibration models transferred between spectroscopic instruments
DEFF Research Database (Denmark)
Eskildsen, Carl Emil Aae; Hansen, Per W.; Skov, Thomas
2016-01-01
In a setting where multiple spectroscopic instruments are used for the same measurements it may be convenient to develop the calibration model on a single instrument and then transfer this model to the other instruments. In the ideal scenario, all instruments provide the same predictions for the same samples using the transferred model. However, sometimes the success of a model transfer is evaluated by comparing the transferred model predictions with the reference values. This is not optimal, as uncertainties in the reference method will impact the evaluation. This paper proposes a new method for calibration model transfer evaluation. The new method is based on comparing predictions from different instruments, rather than comparing predictions and reference values. A total of 75 flour samples were available for the study. All samples were measured on ten near infrared (NIR) instruments from two...
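The core idea, scoring a transferred model by inter-instrument prediction agreement rather than against reference values, can be sketched in a few lines. Everything below is hypothetical for illustration (a fixed linear calibration, simulated spectra with instrument-specific noise, three instruments instead of ten); it is not the paper's actual evaluation procedure:

```python
import numpy as np

def transfer_agreement(model, spectra_by_instrument):
    """Evaluate a transferred calibration by prediction agreement.

    model: function mapping an (n, w) spectra matrix to (n,) predictions.
    spectra_by_instrument: list of (n, w) matrices, same n samples measured
    on each instrument. Returns per-instrument RMS deviation of that
    instrument's predictions from the across-instrument mean prediction.
    """
    preds = np.stack([model(S) for S in spectra_by_instrument])  # (k, n)
    consensus = preds.mean(axis=0)
    return np.sqrt(np.mean((preds - consensus) ** 2, axis=1))

rng = np.random.default_rng(7)
b = rng.standard_normal(20)                    # a fixed linear calibration model
model = lambda S: S @ b
true_spectra = rng.standard_normal((75, 20))   # 75 samples, 20 wavelengths
# three instruments: same samples plus instrument-specific measurement noise
instruments = [true_spectra + rng.standard_normal((75, 20)) * s
               for s in (0.01, 0.01, 0.20)]    # third instrument is poorer
rms = transfer_agreement(model, instruments)
```

No reference values enter the score, so reference-method uncertainty cannot distort it; the instrument whose measurements disagree with the others stands out directly.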
Nonlinear Latent Curve Models for Multivariate Longitudinal Data
Blozis, Shelley A.; Conger, Katherine J.; Harring, Jeffrey R.
2007-01-01
Latent curve models have become a useful approach to analyzing longitudinal data, due in part to their allowance of and emphasis on individual differences in features that describe change. Common applications of latent curve models in developmental studies rely on polynomial functions, such as linear or quadratic functions. Although useful for…
A multivariate approach to modeling univariate seasonal time series
Ph.H.B.F. Franses (Philip Hans)
1994-01-01
A seasonal time series can be represented by a vector autoregressive model for the annual series containing the seasonal observations. This model allows for periodically varying coefficients. When the vector elements are integrated, the maximum likelihood cointegration method can be used
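The vector-of-seasons construction described above can be sketched in a few lines (synthetic quarterly data; the variable names are illustrative, not from the paper):

```python
import numpy as np

# Quarterly series: 10 years of 4 seasonal observations each (synthetic).
rng = np.random.default_rng(0)
y = rng.normal(size=10 * 4)

# Stack into an annual multivariate series: row t holds the four seasonal
# observations of year t. A VAR fitted to Y then allows periodically
# varying (season-specific) autoregressive coefficients, as the abstract
# describes.
Y = y.reshape(10, 4)

print(Y.shape)  # (10, 4): 10 annual observations of a 4-dimensional vector
```

Each column of `Y` is one season's annual subseries; cointegration between the columns can then be examined with standard multivariate methods.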
Tang, An-Min; Tang, Nian-Sheng
2015-02-28
We propose a semiparametric multivariate skew-normal joint model for multivariate longitudinal and multivariate survival data. One main feature of the posited model is that we relax the commonly used normality assumption for random effects and within-subject error by using a centered Dirichlet process prior to specify the random effects distribution and using a multivariate skew-normal distribution to specify the within-subject error distribution and model trajectory functions of longitudinal responses semiparametrically. A Bayesian approach is proposed to simultaneously obtain Bayesian estimates of unknown parameters, random effects and nonparametric functions by combining the Gibbs sampler and the Metropolis-Hastings algorithm. Particularly, a Bayesian local influence approach is developed to assess the effect of minor perturbations to within-subject measurement error and random effects. Several simulation studies and an example are presented to illustrate the proposed methodologies.
Reuter, M; Netter, P
2001-01-01
The present study proposes a hierarchical multivariate statistical prediction model which makes it possible to determine the most prominent variables (physiological, biochemical and personality factors) related to nicotine craving and dopaminergic activation. Based on animal studies reporting a reduction of the rewarding effects of psychotropic drugs after blockade or destruction of the mesolimbic dopamine (DA) system, changes in nicotine craving after pharmacological manipulation by means of a DA agonist (lisuride 0.2 mg) and a DA antagonist (fluphenazine 2 mg) were assessed in 36 healthy male heavy smokers. The major aim was the development of a multivariate prediction model applicable in samples lacking variance homogeneity or multivariate normality. The model proposed is a combination of multivariate parametric and nonparametric methods taking advantage of their individual merits. Personality variables in particular, such as sensation seeking, impulsivity, and neuroticism, proved to be important predictors of craving in this responder approach.
Djorgovski, S. G.
1994-01-01
We developed a package to process and analyze the data from the digital version of the Second Palomar Sky Survey. This system, called SKICAT, incorporates the latest in machine learning and expert systems software technology, in order to classify the detected objects objectively and uniformly, and facilitate handling of the enormous data sets from digital sky surveys and other sources. The system provides a powerful, integrated environment for the manipulation and scientific investigation of catalogs from virtually any source. It serves three principal functions: image catalog construction, catalog management, and catalog analysis. Through use of the GID3* Decision Tree artificial induction software, SKICAT automates the process of classifying objects within CCD and digitized plate images. To exploit these catalogs, the system also provides tools to merge them into a large, complex database which may be easily queried and modified when new data or better methods of calibrating or classifying become available. The most innovative feature of SKICAT is the facility it provides to experiment with and apply the latest in machine learning technology to the tasks of catalog construction and analysis. SKICAT provides a unique environment for implementing these tools for any number of future scientific purposes. Initial scientific verification and performance tests have been made using galaxy counts and measurements of galaxy clustering from small subsets of the survey data, and a search for very high redshift quasars. All of the tests were successful and produced new and interesting scientific results. Attachments to this report give detailed accounts of the technical aspects of the SKICAT system, and of some of the scientific results achieved to date. We also developed a user-friendly package for multivariate statistical analysis of small and moderate-size data sets, called STATPROG. The package was tested extensively on a number of real scientific applications and has produced real, published results.
Multivariate Modelling of Extreme Load Combinations for Wind Turbines
DEFF Research Database (Denmark)
Dimitrov, Nikolay Krasimirov
2015-01-01
We demonstrate a model for estimating the joint probability distribution of two load components acting on a wind turbine blade cross section. The model addresses the problem of modelling the probability distribution of load time histories with large periodic components by dividing the signal into a periodic part and a perturbation term, where each part has a known probability distribution. The proposed model shows good agreement with simulated data under stationary conditions, and a design load envelope based on this model is comparable to the load envelope estimated using the standard procedure for determining contemporaneous loads. By defining a joint probability distribution and full return-period contours for multiple load components, the suggested procedure gives the possibility for determining the most critical loading direction in a blade cross section, or for carrying out reliability analysis...
Functionally unidimensional item response models for multivariate binary data
DEFF Research Database (Denmark)
Ip, Edward; Molenberghs, Geert; Chen, Shyh-Huei; Goegebeur, Yuri; De Boeck, Paul
2013-01-01
The problem of fitting unidimensional item response models to potentially multidimensional data has been extensively studied. The focus of this article is on response data that have a strong dimension but also contain minor nuisance dimensions. Fitting a unidimensional model to such multidimensional data is believed to result in ability estimates that represent a combination of the major and minor dimensions. We conjecture that the underlying dimension for the fitted unidimensional model, which we call the functional dimension, represents a nonlinear projection. In this article we investigate two issues: (a) can a proposed nonlinear projection track the functional dimension well, and (b) what are the biases in the ability estimate and the associated standard error when estimating the functional dimension? To investigate the second issue, the nonlinear projection is used as an evaluative tool. An example regarding a construct of desire for physical competency is used to illustrate the functional unidimensional approach.
Wind Speed Prediction Using a Univariate ARIMA Model and a Multivariate NARX Model
Directory of Open Access Journals (Sweden)
Erasmo Cadenas
2016-02-01
Two one-step-ahead wind speed forecasting models were compared. A univariate model was developed using a linear autoregressive integrated moving average (ARIMA) approach, whose performance is well studied for a large number of prediction problems. The other is a multivariate model developed using a nonlinear autoregressive exogenous artificial neural network (NARX), which uses the variables barometric pressure, air temperature, wind direction, and solar radiation or relative humidity, as well as delayed wind speed. Both models were developed from two databases from two sites: an hourly average measurements database from La Mata, Oaxaca, Mexico, and a ten-minute average measurements database from Metepec, Hidalgo, Mexico. The main objective was to compare the impact of the various meteorological variables on the performance of the multivariate model of wind speed prediction with respect to the high-performance univariate linear model. The NARX model gave better results, with improvements over the ARIMA model of between 5.5% and 10.6% for the hourly database and between 2.3% and 12.8% for the ten-minute database, for mean absolute error and mean squared error, respectively.
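As a rough sketch of the univariate baseline in such a comparison, one can fit a low-order autoregression by least squares and score one-step-ahead forecasts on a hold-out set. This is a minimal stand-in (synthetic data, plain AR(2)) for the paper's full ARIMA and NARX pipelines, not the authors' code:

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic "wind speed": an AR(2) process with Gaussian noise.
n = 500
y = np.zeros(n)
for t in range(2, n):
    y[t] = 0.6 * y[t - 1] + 0.2 * y[t - 2] + rng.normal(scale=0.5)

# Fit AR(2) coefficients by least squares on the first 400 points.
train = 400
X = np.column_stack([y[1:train - 1], y[0:train - 2]])  # lags 1 and 2
b, *_ = np.linalg.lstsq(X, y[2:train], rcond=None)

# One-step-ahead forecasts on the hold-out, scored by mean absolute error
# against a naive persistence forecast (tomorrow = today).
pred = b[0] * y[train - 1:n - 1] + b[1] * y[train - 2:n - 2]
mae_ar = np.mean(np.abs(y[train:] - pred))
mae_persist = np.mean(np.abs(y[train:] - y[train - 1:n - 1]))
print(mae_ar, mae_persist)
```

The multivariate NARX side of the comparison would replace the lag matrix `X` with lagged exogenous inputs (pressure, temperature, etc.) feeding a nonlinear model.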
A multivariate model of stakeholder preference for lethal cat management.
Wald, Dara M; Jacobson, Susan K
2014-01-01
Identifying stakeholder beliefs and attitudes is critical for resolving management conflicts. Debate over outdoor cat management is often described as a conflict between two groups, environmental advocates and animal welfare advocates, but little is known about the variables predicting differences among these critical stakeholder groups. We administered a mail survey to randomly selected stakeholders representing both of these groups (n=1,596) in Florida, where contention over the management of outdoor cats has been widespread. We used a structural equation model to evaluate stakeholder intention to support non-lethal management. The cognitive hierarchy model predicted that values influenced beliefs, which predicted general and specific attitudes, which in turn influenced behavioral intentions. We posited that specific attitudes would mediate the effect of general attitudes, beliefs, and values on management support. Model fit statistics suggested that the final model fit the data well (CFI=0.94, RMSEA=0.062). The final model explained 74% of the variance in management support, and positive attitudes toward lethal management (humaneness) had the largest direct effect on management support. Specific attitudes toward lethal management and general attitudes toward outdoor cats mediated the relationship between positive and negative beliefs about outdoor cats and stakeholder intention to support non-lethal cat management. Our findings suggest that stakeholders can simultaneously perceive both positive and negative beliefs about outdoor cats, which influence attitudes toward and support for non-lethal management.
Modeling inflation rates and exchange rates in Ghana: application of multivariate GARCH models
Nortey, Ezekiel NN; Ngoh, Delali D; Doku-Amponsah, Kwabena; Ofori-Boateng, Kenneth
2015-01-01
This paper investigated the volatility and conditional relationship among inflation rates, exchange rates and interest rates, and constructed a model using multivariate GARCH DCC and BEKK specifications with Ghanaian data from January 1990 to December 2013. The study revealed that the cumulative depreciation of the cedi to the US dollar from 1990 to 2013 was 7,010.2% and the yearly weighted depreciation of the cedi to the US dollar for the period was 20.4%. There was evidence that...
Multi-model unfalsified switching control of uncertain multivariable systems
Baldi, S.; Battistelli, G.; Mari, D.; Mosca, E.; Tesi, P.
2012-01-01
This paper addresses the problem of controlling an uncertain multi-input multi-output system by means of adaptive switching control schemes. In particular, the paper aims at extending the multi-model unfalsified control approach, so far restricted to single-input single-output systems, to a general
Multiple-Model Adaptive Switching Control for Uncertain Multivariable Systems
Baldi, Simone; Battistelli, Giorgio; Mari, Daniele; Mosca, Edoardo; Tesi, Pietro
2011-01-01
This paper addresses the problem of controlling an uncertain multi-input multi-output (MIMO) system by means of adaptive switching control schemes. In particular, the paper aims at extending the approach of multiple-model unfalsified adaptive switched control, so far restricted to single-input single-output systems...
Linear models for multivariate, time series, and spatial data
Christensen, Ronald
1991-01-01
This is a companion volume to Plane Answers to Complex Questions: The Theory of Linear Models. It consists of six additional chapters written in the same spirit as the last six chapters of the earlier book. Brief introductions are given to topics related to linear model theory. No attempt is made to give a comprehensive treatment of the topics. Such an effort would be futile. Each chapter is on a topic so broad that an in-depth discussion would require a book-length treatment. People need to impose structure on the world in order to understand it. There is a limit to the number of unrelated facts that anyone can remember. If ideas can be put within a broad, sophisticatedly simple structure, not only are they easier to remember but often new insights become available. In fact, sophisticatedly simple models of the world may be the only ones that work. I have often heard Arnold Zellner say that, to the best of his knowledge, this is true in econometrics. The process of modeling is fundamental to understand...
Testing for causality in variance using multivariate GARCH models
C.M. Hafner (Christian); H. Herwartz
2004-01-01
Tests of causality in variance in multiple time series have been proposed recently, based on residuals of estimated univariate models. Although such tests are applied frequently, little is known about their power properties. In this paper we show that a convenient alternative to residual
DEFF Research Database (Denmark)
Silvennoinen, Annastiina; Teräsvirta, Timo
In this paper we propose a multivariate GARCH model with a time-varying conditional correlation structure. The new Double Smooth Transition Conditional Correlation (DSTCC) GARCH model extends the Smooth Transition Conditional Correlation (STCC) GARCH model of Silvennoinen and Teräsvirta (2005) by including another variable according to which the correlations change smoothly between states of constant correlations. A Lagrange multiplier test is derived to test the constancy of correlations against the DSTCC-GARCH model, and another one to test for another transition in the STCC-GARCH framework. In addition, other specification tests, with the aim of aiding the model building procedure, are considered. Analytical expressions for the test statistics and the required derivatives are provided. The model is applied to a selection of world stock indices, and it is found that time is an important factor affecting...
Multivariate Modelling of the Career Intent of Air Force Personnel.
1980-09-01
Drawing on theories of motivation, Victor H. Vroom developed in 1964 what is known today as the first integrated model of expectancy theory (Vroom, Victor H. Work and Motivation. New York: Wiley, 1964).
Multivariate Models for Prediction of Human Skin Sensitization ...
One of the Interagency Coordinating Committee on the Validation of Alternative Methods' (ICCVAM) top priorities is the development and evaluation of non-animal approaches to identify potential skin sensitizers. The complexity of biological events necessary to produce skin sensitization suggests that no single alternative method will replace the currently accepted animal tests. ICCVAM is evaluating an integrated approach to testing and assessment based on the adverse outcome pathway for skin sensitization that uses machine learning approaches to predict human skin sensitization hazard. We combined data from three in chemico or in vitro assays (the direct peptide reactivity assay (DPRA), human cell line activation test (h-CLAT) and KeratinoSens assay), six physicochemical properties and an in silico read-across prediction of skin sensitization hazard into 12 variable groups. The variable groups were evaluated using two machine learning approaches, logistic regression and support vector machine, to predict human skin sensitization hazard. Models were trained on 72 substances and tested on an external set of 24 substances. The six models (three logistic regression and three support vector machine) with the highest accuracy (92%) used: (1) DPRA, h-CLAT and read-across; (2) DPRA, h-CLAT, read-across and KeratinoSens; or (3) DPRA, h-CLAT, read-across, KeratinoSens and log P. The models performed better at predicting human skin sensitization hazard than the murine
Can multivariate models based on MOAKS predict OA knee pain? Data from the Osteoarthritis Initiative
Luna-Gómez, Carlos D.; Zanella-Calzada, Laura A.; Galván-Tejada, Jorge I.; Galván-Tejada, Carlos E.; Celaya-Padilla, José M.
2017-03-01
Osteoarthritis is the most common rheumatic disease in the world. Knee pain is its most disabling symptom, and the prediction of pain is one of the targets of preventive medicine; such prediction can inform new therapies or treatments. Using magnetic resonance imaging and the MOAKS grading scales, a multivariate model based on genetic algorithms is presented. A predictive model can be useful to associate minor structural changes in the joint with future knee pain. Results suggest that multivariate models can be predictive of future chronic knee pain. All models (T0, T1 and T2) were statistically significant; all p values were 0.60.
Constrained and unconstrained multivariate normal finite mixture modeling of Piagetian data.
Dolan, C.V.; Jansen, B.R.J.; van der Maas, H.L.J.
2004-01-01
We present the results of multivariate normal mixture modeling of Piagetian data. The sample consists of 101 children, who carried out a (pseudo-)conservation computer task on four occasions. We fitted both cross-sectional mixture models, and longitudinal models based on a Markovian transition
A Sandwich-Type Standard Error Estimator of SEM Models with Multivariate Time Series
Zhang, Guangjian; Chow, Sy-Miin; Ong, Anthony D.
2011-01-01
Structural equation models are increasingly used as a modeling tool for multivariate time series data in the social and behavioral sciences. Standard error estimators of SEM models, originally developed for independent data, require modifications to accommodate the fact that time series data are inherently dependent. In this article, we extend a…
Huitsing, Gijs; van Duijn, Marijtje; Snijders, Thomas; Wang, P.; Sainio, Miia; Salmivalli, Christina; Veenstra, René
2012-01-01
Three types of relations between elementary school children were investigated: networks of general dislike and bullying were related to networks of general like. These were modeled using multivariate cross-sectional (statistical) network models. Exponential random graph models for a sample of 18 classrooms...
Exploring the potential of multivariate depth-damage and rainfall-damage models
DEFF Research Database (Denmark)
van Ootegem, Luc; van Herck, K.; Creten, T.
2017-01-01
In Europe, floods are among the natural catastrophes that cause the largest economic damage. This article explores the potential of two distinct types of multivariate flood damage models: ‘depth-damage’ models and ‘rainfall-damage’ models. We use survey data of 346 Flemish households that were vi...
Modeling a multivariable reactor and on-line model predictive control.
Yu, D W; Yu, D L
2005-10-01
A nonlinear first-principles model is developed for a laboratory-scale multivariable chemical reactor rig in this paper, and on-line model predictive control (MPC) is implemented on the rig. The reactor has three output variables (temperature, pH, and dissolved oxygen) with nonlinear dynamics, and is therefore used as a pilot system for the biochemical industry. A nonlinear discrete-time model is derived for each of the three output variables, and their model parameters are estimated from real data using an adaptive optimization method. The developed model is used in a nonlinear MPC scheme. An accurate multistep-ahead prediction is obtained for MPC, where the extended Kalman filter is used to estimate unknown system states. The on-line control is implemented and a satisfactory tracking performance is achieved. The MPC is compared with three decentralized PID controllers, and the advantage of the nonlinear MPC over the PID is clearly shown.
Modeling the relationship between climate oscillations and drought by a multivariate GARCH model
Modarres, R.; Ouarda, T. B. M. J.
2014-01-01
Typical multivariate time series models may exhibit comovement in the mean but not in the variance of hydrologic and climatic variables. This paper introduces multivariate generalized autoregressive conditional heteroscedasticity (GARCH) models to capture the comovement of the variance, or the conditional covariance, between two hydroclimatic time series. The diagonal vectorized and Baba-Engle-Kraft-Kroner models are developed to evaluate the covariance between drought and two atmospheric circulation indices, the Southern Oscillation Index (SOI) and the North Atlantic Oscillation (NAO), during 1954-2000. The univariate generalized autoregressive conditional heteroscedasticity model indicates a strong persistency level in conditional variance for NAO and a moderate persistency level for SOI. The conditional variance of the short-term drought index shows a low level of persistency, while the long-term drought index shows a high level of persistency in conditional variance. The estimated conditional covariance between drought and atmospheric indices is shown to be weak and negative. It is also observed that the covariance between drought and atmospheric indices is largely dependent on the short-run variance of the atmospheric indices rather than their long-run variance. The nonlinearity and stationarity tests show that the conditional covariances are nonlinear but stationary. However, the degree of nonlinearity is higher for the covariance between the long-term drought index and the atmospheric indices. It is also observed that the nonlinearity of NAO is higher than that of SOI, in contrast to the stationarity, which is stronger for the SOI time series.
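The conditional-covariance recursion underlying such models can be illustrated with a bivariate diagonal BEKK(1,1) filter; the parameter values below are hand-picked for illustration, not estimates from any drought or climate data:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 300
eps = rng.normal(size=(n, 2))  # stand-in for two hydroclimatic residual series

# Diagonal BEKK(1,1): H_t = C C' + A' e_{t-1} e_{t-1}' A + B' H_{t-1} B,
# with lower-triangular C and diagonal A, B (illustrative values only).
C = np.array([[0.3, 0.0], [0.1, 0.3]])
A = np.diag([0.3, 0.2])
B = np.diag([0.90, 0.92])

H = np.empty((n, 2, 2))
H[0] = np.cov(eps.T)  # initialize with the sample covariance
for t in range(1, n):
    e = eps[t - 1][:, None]  # 2x1 shock vector from the previous step
    H[t] = C @ C.T + A.T @ (e @ e.T) @ A + B.T @ H[t - 1] @ B

# Conditional correlation implied by the filtered covariances.
rho = H[:, 0, 1] / np.sqrt(H[:, 0, 0] * H[:, 1, 1])
print(rho.min(), rho.max())
```

Because `C @ C.T` is positive definite and the other two terms are positive semidefinite, every `H[t]` stays positive definite, so the implied correlations remain inside (-1, 1).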
Univariate and multivariate general linear models theory and applications with SAS
Kim, Kevin
2006-01-01
Reviewing the theory of the general linear model (GLM) using a general framework, Univariate and Multivariate General Linear Models: Theory and Applications with SAS, Second Edition presents analyses of simple and complex models, both univariate and multivariate, that employ data sets from a variety of disciplines, such as the social and behavioral sciences. With revised examples that include options available in SAS 9.0, this expanded edition separates theory from applications within each chapter. Following an overview of the GLM, the book introduces unrestricted GLMs to analyze multiple regression...
Multivariate Autoregressive Modeling and Granger Causality Analysis of Multiple Spike Trains
Directory of Open Access Journals (Sweden)
Michael Krumin
2010-01-01
Recent years have seen the emergence of microelectrode arrays and optical methods allowing simultaneous recording of spiking activity from populations of neurons in various parts of the nervous system. The analysis of multiple neural spike train data could benefit significantly from existing methods for multivariate time-series analysis, which have proven to be very powerful in the modeling and analysis of continuous neural signals such as EEG signals. However, those methods have not generally been well adapted to point processes. Here, we use our recent results on correlation distortions in multivariate Linear-Nonlinear-Poisson spiking neuron models to derive generalized Yule-Walker-type equations for fitting "hidden" multivariate autoregressive models. We use this new framework to perform Granger causality analysis in order to extract the directed information flow pattern in networks of simulated spiking neurons. We discuss the relative merits and limitations of the new method.
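A standard regression-based Granger-causality check, the continuous-signal analogue of what this paper extends to spike trains, can be sketched as follows (simulated data; this is the classical restricted-vs-full comparison, not the authors' hidden-MAR method):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 1000
x = np.zeros(n)
y = np.zeros(n)
for t in range(1, n):
    x[t] = 0.5 * x[t - 1] + rng.normal()
    y[t] = 0.5 * y[t - 1] + 0.4 * x[t - 1] + rng.normal()  # x drives y

def sse(target, predictors):
    """Residual sum of squares of an ordinary least-squares regression."""
    beta, *_ = np.linalg.lstsq(predictors, target, rcond=None)
    resid = target - predictors @ beta
    return resid @ resid

# Restricted model: y_t on its own past; full model adds x's past.
ones = np.ones(n - 1)
restricted = sse(y[1:], np.column_stack([ones, y[:-1]]))
full = sse(y[1:], np.column_stack([ones, y[:-1], x[:-1]]))

# x Granger-causes y if adding x's past clearly reduces the error.
F = (restricted - full) / (full / (n - 1 - 3))
print(F > 10)  # a large F statistic: strong evidence of x -> y flow
```

With spike trains the same logic applies, but the paper's Yule-Walker-type equations correct for the nonlinear spike-generation step before fitting the autoregressive coefficients.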
Computer-Aided Decisions in Human Services: Expert Systems and Multivariate Models.
Sicoly, Fiore
1989-01-01
This comparison of two approaches to the development of computerized supports for decision making, expert systems and multivariate models, focuses on computerized systems that assist professionals with tasks related to diagnosis or classification in human services. Validation of both expert systems and statistical models is emphasized.
A Multivariate Model for the Meta-Analysis of Study Level Survival Data at Multiple Times
Jackson, Dan; Rollins, Katie; Coughlin, Patrick
2014-01-01
Motivated by our meta-analytic dataset involving survival rates after treatment for critical leg ischemia, we develop and apply a new multivariate model for the meta-analysis of study level survival data at multiple times. Our data set involves 50 studies that provide mortality rates at up to seven time points, which we model simultaneously, and…
DEFF Research Database (Denmark)
Ørregård Nielsen, Morten
2015-01-01
This article proves consistency and asymptotic normality for the conditional-sum-of-squares estimator, which is equivalent to the conditional maximum likelihood estimator, in multivariate fractional time-series models. The model is parametric and quite general and, in particular, encompasses...
Meta-Analytic Structural Equation Modeling (MASEM): Comparison of the Multivariate Methods
Zhang, Ying
2011-01-01
Meta-analytic Structural Equation Modeling (MASEM) has drawn interest from many researchers recently. In doing MASEM, researchers usually first synthesize correlation matrices across studies using meta-analysis techniques and then analyze the pooled correlation matrix using structural equation modeling techniques. Several multivariate methods of…
Multivariate nonlinear time series modeling of exposure and risk in road safety research
Bijleveld, F.; Commandeur, J.; Montfort, van K.; Koopman, S.J.
2010-01-01
A multivariate non-linear time series model for road safety data is presented. The model is applied in a case-study into the development of a yearly time series of numbers of fatal accidents (inside and outside urban areas) and numbers of kilometres driven by motor vehicles in the Netherlands between...
Moons, Karel G M; Altman, Douglas G; Reitsma, Johannes B; Ioannidis, John P A; Macaskill, Petra; Steyerberg, Ewout W; Vickers, Andrew J; Ransohoff, David F; Collins, Gary S
2015-01-01
The TRIPOD (Transparent Reporting of a multivariable prediction model for Individual Prognosis Or Diagnosis) Statement includes a 22-item checklist, which aims to improve the reporting of studies developing, validating, or updating a prediction model, whether for diagnostic or prognostic purposes...
Do We Really Need Both BEKK and DCC? A Tale of Two Multivariate GARCH Models
M. Caporin (Massimiliano); M.J. McAleer (Michael)
2010-01-01
The management and monitoring of very large portfolios of financial assets are routine for many individuals and organizations. The two most widely used models of conditional covariances and correlations in the class of multivariate GARCH models are BEKK and DCC. It is well known that BEKK...
Institute of Scientific and Technical Information of China (English)
Yee LEUNG; WU Kefa; DONG Tianxin
2001-01-01
In this paper, a multivariate linear functional relationship model, where the covariance matrix of the observational errors is not restricted, is considered. The parameter estimation of this model is discussed. The estimators are shown to be strongly consistent under some mild conditions on the incidental parameters.
A multivariate random-parameters Tobit model for analyzing highway crash rates by injury severity.
Zeng, Qiang; Wen, Huiying; Huang, Helai; Pei, Xin; Wong, S C
2017-02-01
In this study, a multivariate random-parameters Tobit model is proposed for the analysis of crash rates by injury severity. In the model, both correlation across injury severity and unobserved heterogeneity across road-segment observations are accommodated. The proposed model is compared with a multivariate (fixed-parameters) Tobit model in the Bayesian context, using a crash dataset collected from the Traffic Information System of Hong Kong. The dataset contains crash, road geometric and traffic information on 224 directional road segments for a five-year period (2002-2006). The multivariate random-parameters Tobit model provides a much better fit than its fixed-parameters counterpart, according to the deviance information criterion and Bayesian R², while it reveals a higher correlation between crash rates at different severity levels. The parameter estimates show that a few risk factors (bus stop, lane changing opportunity and lane width) have heterogeneous effects on crash-injury-severity rates. For the other factors, the variances of their random parameters are insignificant at the 95% credibility level, so the random parameters were set to be fixed across observations. Nevertheless, most of these fixed coefficients are estimated with higher precision (i.e., smaller variances) in the random-parameters model. Thus, the random-parameters Tobit model, which provides a more comprehensive understanding of the factors' effects on crash rates by injury severity, is superior to the multivariate Tobit model and should be considered a good alternative for traffic safety analysis.
Identification of Civil Engineering Structures using Multivariate ARMAV and RARMAV Models
DEFF Research Database (Denmark)
Kirkegaard, Poul Henning; Andersen, P.; Brincker, Rune
This paper presents how to make system identification of civil engineering structures using multivariate auto-regressive moving-average vector (ARMAV) models. Further, the ARMAV technique is extended to a recursive technique (RARMAV). The ARMAV model is used to identify measured stationary data....... The results show the usefulness of the approaches for identification of civil engineering structures excited by natural excitation...
Multivariate modelling of endophenotypes associated with the metabolic syndrome in Chinese twins
DEFF Research Database (Denmark)
Pang, Z; Zhang, D; Li, S
2010-01-01
AIMS/HYPOTHESIS: The common genetic and environmental effects on endophenotypes related to the metabolic syndrome have been investigated using bivariate and multivariate twin models. This paper extends the pairwise analysis approach by introducing independent and common pathway models to Chinese...
Modeling inflation rates and exchange rates in Ghana: application of multivariate GARCH models.
Nortey, Ezekiel Nn; Ngoh, Delali D; Doku-Amponsah, Kwabena; Ofori-Boateng, Kenneth
2015-01-01
This paper investigated the volatility and conditional relationships among inflation rates, exchange rates and interest rates, and constructed models using the multivariate GARCH DCC and BEKK specifications on Ghanaian data from January 1990 to December 2013. The study revealed that the cumulative depreciation of the cedi against the US dollar from 1990 to 2013 was 7,010.2% and the yearly weighted depreciation of the cedi against the US dollar for the period was 20.4%. There was evidence that a stable inflation rate does not mean that exchange rates and interest rates are expected to be stable. Rather, when the cedi performs well on the forex market, inflation rates and interest rates react positively and become stable in the long run. The BEKK model is robust for modelling and forecasting the volatility of inflation rates, exchange rates and interest rates. The DCC model is robust for modelling the conditional and unconditional correlations among inflation rates, exchange rates and interest rates. The BEKK model, which forecasted high exchange rate volatility for the year 2014, is very robust for modelling exchange rates in Ghana. The mean equation of the DCC model is also robust for forecasting inflation rates in Ghana.
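The BEKK covariance recursion used in studies like this one is easiest to see generatively. Below is a minimal sketch simulating a bivariate BEKK(1,1) process; the parameter matrices C, A and B are illustrative assumptions, not estimates from the Ghanaian data.

```python
import numpy as np

def simulate_bekk(T, C, A, B, seed=0):
    """Simulate a bivariate BEKK(1,1) process:
    H_t = C C' + A' e_{t-1} e_{t-1}' A + B' H_{t-1} B."""
    rng = np.random.default_rng(seed)
    k = C.shape[0]
    H = C @ C.T                                   # starting conditional covariance
    r = np.zeros((T, k))
    for t in range(T):
        r[t] = rng.multivariate_normal(np.zeros(k), H)
        H = C @ C.T + A.T @ np.outer(r[t], r[t]) @ A + B.T @ H @ B
    return r

# Illustrative parameter matrices (assumed; chosen so that a^2 + b^2 < 1 for stationarity)
C = np.array([[0.3, 0.0], [0.1, 0.2]])
A = 0.3 * np.eye(2)
B = 0.9 * np.eye(2)
returns = simulate_bekk(500, C, A, B)
```

Estimation would then maximize the Gaussian likelihood over C, A and B; the simulation above is only the data-generating half of that exercise.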
Development of a multivariate empirical model for predicting weak rock mass modulus
Institute of Scientific and Technical Information of China (English)
Kallu Raj R.; Keffeler Evan R.; Watters Robert J.; Agharazi Alireza
2015-01-01
Estimating weak rock mass modulus has historically proven difficult, although this mechanical property is an important input to many types of geotechnical analyses. An empirical database of weak rock mass modulus with associated detailed geotechnical parameters was assembled from plate loading tests performed at underground mines in Nevada, the Bakhtiary Dam project, and the Portugues Dam project. The database was used to assess the accuracy of published single-variate models and to develop a multivariate model for predicting in-situ weak rock mass modulus when limited geotechnical data are available. Only two of the published models were adequate for predicting the modulus of weak rock masses over limited ranges of alteration intensities, and none of the models provided good estimates of modulus over a range of geotechnical properties. In light of this shortcoming, a multivariate model was developed from the weak rock mass modulus dataset. The new model is exponential in form and has the following independent variables: (1) average block size or joint spacing, (2) field-estimated rock strength, (3) discontinuity roughness, and (4) discontinuity infilling hardness. The multivariate model provided better estimates of modulus for both hard-blocky rock masses and intensely-altered rock masses.
Chen, Xiaohong; Fan, Yanqin; Pouzo, Demian; Ying, Zhiliang
2010-07-01
We study estimation and model selection of semiparametric models of multivariate survival functions for censored data, which are characterized by possibly misspecified parametric copulas and nonparametric marginal survivals. We obtain the consistency and root-n asymptotic normality of a two-step copula estimator to the pseudo-true copula parameter value according to KLIC, and provide a simple consistent estimator of its asymptotic variance, allowing for a first-step nonparametric estimation of the marginal survivals. We establish the asymptotic distribution of the penalized pseudo-likelihood ratio statistic for comparing multiple semiparametric multivariate survival functions subject to copula misspecification and general censorship. An empirical application is provided.
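The two-step idea (nonparametric margins first, then a parametric copula fitted by pseudo-likelihood) can be illustrated without the censoring machinery the paper addresses. The sketch below assumes complete, uncensored data, uses rescaled ranks as first-step marginal estimates, and fits a Clayton copula as an arbitrary illustrative choice of parametric family.

```python
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import rankdata

def clayton_loglik(theta, u, v):
    """Clayton copula log-likelihood on pseudo-observations (theta > 0)."""
    return np.sum(np.log(1 + theta)
                  - (theta + 1) * (np.log(u) + np.log(v))
                  - (2 + 1 / theta) * np.log(u ** -theta + v ** -theta - 1))

def fit_clayton_two_step(x, y):
    """Step 1: nonparametric margins via rescaled ranks (pseudo-observations).
    Step 2: maximize the copula pseudo-likelihood."""
    n = len(x)
    u, v = rankdata(x) / (n + 1), rankdata(y) / (n + 1)
    res = minimize_scalar(lambda th: -clayton_loglik(th, u, v),
                          bounds=(0.01, 20.0), method="bounded")
    return res.x

# Positively dependent synthetic pair (no censoring, unlike the paper's setting)
rng = np.random.default_rng(1)
z = rng.standard_normal((1000, 2))
x, y = z[:, 0], 0.7 * z[:, 0] + 0.3 * z[:, 1]
theta_hat = fit_clayton_two_step(x, y)
```

Under copula misspecification, this estimator converges to the pseudo-true parameter in the KLIC sense discussed in the abstract, rather than to a "true" Clayton parameter.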
Multivariable Linear Regression Model for Promotional Forecasting: The Coca Cola - Morrisons Case
Zheng, Yiwei/Y
2009-01-01
This paper describes a promotional forecasting model built with the linear regression module in Microsoft Excel. It intends to provide quick and reasonably reliable forecasts and to assist the CPFR between Coca Cola Enterprises (CCE) and Morrisons. The model is derived from previous research and a literature review on CPFR, promotion, forecasting and modelling. It is designed as a multivariable linear regression model, which involves several promotional-mix elements as variables, including ...
Forecasting Multivariate Volatility using the VARFIMA Model on Realized Covariance Cholesky Factors
DEFF Research Database (Denmark)
Halbleib, Roxana; Voev, Valeri
2011-01-01
This paper analyzes the forecast accuracy of the multivariate realized volatility model introduced by Chiriac and Voev (2010), subject to different degrees of model parametrization and economic evaluation criteria. By modelling the Cholesky factors of the covariance matrices, the model generates positive definite, but biased covariance forecasts. In this paper, we provide empirical evidence that parsimonious versions of the model generate the best covariance forecasts in the absence of bias correction. Moreover, we show by means of stochastic dominance tests that any risk averse investor, regardless of the type of utility function or return distribution, would be better off using this model than some standard approaches.
The multivariate beta process and an extension of the Polya tree model.
Trippa, Lorenzo; Müller, Peter; Johnson, Wesley
2011-03-01
We introduce a novel stochastic process that we term the multivariate beta process. The process is defined for modelling dependent random probabilities and has beta marginal distributions. We use this process to define a probability model for a family of unknown distributions indexed by covariates. The marginal model for each distribution is a Polya tree prior. An important feature of the proposed prior is the easy centring of the nonparametric model around any parametric regression model. We use the model to implement nonparametric inference for survival distributions. The nonparametric model that we introduce can be adopted to extend the support of prior distributions for parametric regression models.
Multivariate poisson-lognormal model for modeling related factors in crash frequency by severity
Directory of Open Access Journals (Sweden)
Mehdi Tazhibi
2013-01-01
Aims: Traditionally, roadway safety analyses have used univariate distributions to model crash data for each level of severity separately. This paper uses multivariate Poisson-lognormal (MVPLN) models to estimate the expected crash frequency at two levels of severity and then compares those estimates with the univariate Poisson-lognormal (UVPLN) and univariate Poisson (UVP) models. Materials and Methods: Parameter estimation was done by the Bayesian method for crash data at two severity levels, collected at intersections in Isfahan city over six months. Results: The results showed an over-dispersion issue in the data. The UVP model is not able to overcome this problem, while the MVPLN model can account for over-dispersion. Also, the estimates of the extra Poisson variation parameters in the MVPLN model were smaller than in the UVPLN model, which improves the precision of the MVPLN model. Hence, the MVPLN model fits the data set better. The results also showed that the effect of total average annual daily traffic (AADT) on property-damage-only crashes was significant in all of the models, but the effect of total left-turn AADT on injury and fatality crashes was significant only in the UVP model. Hence, holding all other factors fixed, more property-damage-only crashes were expected at higher total AADT. For example, under the MVPLN model, an increase of 1000 vehicles in (average) total AADT was predicted to result in 31% more property-damage-only crashes. Conclusion: Hence, reduction of total AADT was predicted to be highly cost-effective in terms of crash cost reductions over the long run.
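The structure of the MVPLN model is easiest to see generatively: the counts at the two severity levels are Poisson draws whose log-rates share correlated normal random effects, which is exactly what produces both over-dispersion and cross-severity correlation. A minimal sketch with assumed coefficients (not those estimated for Isfahan):

```python
import numpy as np

def simulate_mvpln(beta, Sigma, aadt, seed=0):
    """Generative sketch of a bivariate Poisson-lognormal crash-frequency model:
    two severity levels share correlated lognormal random effects."""
    rng = np.random.default_rng(seed)
    n = len(aadt)
    eps = rng.multivariate_normal(np.zeros(2), Sigma, size=n)   # extra-Poisson variation
    log_mu = beta[0] + beta[1] * np.log(aadt)                   # same covariate, both levels
    return rng.poisson(np.exp(log_mu[:, None] + eps))

# Assumed coefficients, random-effect covariance and AADT values (illustrative only)
aadt = np.exp(np.random.default_rng(2).normal(8.0, 0.5, size=200))
Sigma = np.array([[0.5, 0.3], [0.3, 0.5]])
counts = simulate_mvpln(beta=(-6.0, 0.8), Sigma=Sigma, aadt=aadt)
```

In the simulated counts the sample variance exceeds the mean at each severity level (over-dispersion) and the two columns are positively correlated, mirroring the two features the MVPLN model is designed to capture.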
Shen, Yanna; Cooper, Gregory F
2012-09-01
This paper investigates Bayesian modeling of known and unknown causes of events in the context of disease-outbreak detection. We introduce a multivariate Bayesian approach that models multiple evidential features of every person in the population. This approach models and detects (1) known diseases (e.g., influenza and anthrax) by using informative prior probabilities and (2) unknown diseases (e.g., a new, highly contagious respiratory virus that has never been seen before) by using relatively non-informative prior probabilities. We report the results of simulation experiments which support that this modeling method can improve the detection of new disease outbreaks in a population. A contribution of this paper is that it introduces a multivariate Bayesian approach for jointly modeling both known and unknown causes of events. Such modeling has general applicability in domains where the space of known causes is incomplete.
Analyzing Multiple Multivariate Time Series Data Using Multilevel Dynamic Factor Models.
Song, Hairong; Zhang, Zhiyong
2014-01-01
Multivariate time series data offer researchers opportunities to study dynamics of various systems in the social and behavioral sciences. The dynamic factor model (DFM), as an idiographic approach for studying intraindividual variability and dynamics, has typically been applied to time series data obtained from a single unit. When multivariate time series data are collected from multiple units, how to synchronize dynamical information becomes a salient issue. To address this issue, the current study presents a multilevel dynamic factor model (MDFM) that analyzes multiple multivariate time series in a multilevel SEM framework. The MDFM not only disentangles within- and between-person variability but also models dynamics of the intraindividual processes. To illustrate the uses of MDFMs, we applied lag-0, lag-1, and lag-2 MDFMs to empirical data on affect collected from 205 dating couples who had at least 50 consecutive days of observations. We also considered a model extension where the dynamical coefficients were allowed to vary randomly in the population. The empirical analysis yielded interesting findings regarding affect regulation and coregulation within couples, demonstrating promising uses of MDFMs in analyzing multiple multivariate time series. In the end, we discuss a number of methodological issues in the applications of MDFMs and point out possible directions for future research.
Molenaar, P.C.M.
1987-01-01
Outlines a frequency domain analysis of the dynamic factor model and proposes a solution to the problem of constructing a causal filter of lagged factor loadings. The method is illustrated with applications to simulated and real multivariate time series. The latter applications involve topographic a
DMU - A Package for Analyzing Multivariate Mixed Models in quantitative Genetics and Genomics
DEFF Research Database (Denmark)
Madsen, Per; Jensen, Just; Labouriau, Rodrigo
The DMU-package for Analyzing Multivariate Mixed Models has been developed over a period of more than 25 years. This paper gives an overview of new features and the recent developments around the DMU-package, including: Genomic prediction (SNP-BLUP, G-BLUP and “One-Step”), Genome-wide association...
Directory of Open Access Journals (Sweden)
Liu Gang
2009-01-01
By using the methods of linear algebra and matrix inequality theory, we obtain the characterization of admissible estimators in the general multivariate linear model with respect to an inequality-restricted parameter set. In the classes of homogeneous and general linear estimators, the necessary and sufficient conditions for the estimators of the regression coefficient function to be admissible are established.
First principles dynamic modeling and multivariable control of a cryogenic distillation process
Roffel, B.; Betlem, B.H.L.; Ruijter, J.A.
2000-01-01
In order to investigate the feasibility of constrained multivariable control of a heat-integrated cryogenic distillation process, a rigorous first principles dynamic model was developed and tested against a limited number of experiments. It was found that the process variables showed a large amount
Modeling of turbulent supersonic H2-air combustion with a multivariate beta PDF
Baurle, R. A.; Hassan, H. A.
1993-01-01
Recent calculations of turbulent supersonic reacting shear flows using an assumed multivariate beta PDF (probability density function) resulted in reduced production rates and a delay in the onset of combustion. This result is not consistent with available measurements. The present research explores two possible reasons for this behavior: use of PDFs that do not yield Favre averaged quantities, and the gradient diffusion assumption. A new multivariate beta PDF involving species densities is introduced which makes it possible to compute Favre averaged mass fractions. However, using this PDF did not improve comparisons with experiment. A countergradient diffusion model is then introduced. Preliminary calculations suggest this to be the cause of the discrepancy.
Multivariate Radiological-Based Models for the Prediction of Future Knee Pain: Data from the OAI
Directory of Open Access Journals (Sweden)
Jorge I. Galván-Tejada
2015-01-01
In this work, the potential of X-ray-based multivariate prognostic models to predict the onset of chronic knee pain is presented. Using quantitative X-ray image assessments of joint space width (JSW) and paired semiquantitative central X-ray scores from the Osteoarthritis Initiative (OAI), a case-control study is presented. The pain assessments of the right knee at the baseline and 60-month visits were used to screen for case/control subjects. Scores were analyzed at the time of pain incidence (T-0), the year prior to incidence (T-1), and two years before pain incidence (T-2). Multivariate models were created by a cross-validated elastic-net regularized generalized linear model feature-selection tool. Univariate differences between cases and controls were reported as AUCs, C-statistics, and odds ratios. Univariate analysis indicated that medial osteophytes were significantly more prevalent in cases than controls: C-stat 0.62, 0.62, and 0.61 at T-0, T-1, and T-2, respectively. The multivariate JSW models significantly predicted pain: AUC = 0.695, 0.623, and 0.620 at T-0, T-1, and T-2, respectively. Semiquantitative multivariate models predicted pain with C-stat = 0.671, 0.648, and 0.645 at T-0, T-1, and T-2, respectively. Multivariate models derived from plain X-ray radiography assessments may be used to predict subjects that are at risk of developing knee pain.
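A cross-validated elastic-net model for case/control screening of this kind can be sketched with scikit-learn. The feature matrix below is synthetic stand-in data, not OAI scores, and the penalty settings are illustrative rather than tuned.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import cross_val_predict

# Synthetic stand-in for radiological features (not OAI data)
rng = np.random.default_rng(0)
n, p = 400, 10
X = rng.standard_normal((n, p))
logit = 0.8 * X[:, 0] - 0.6 * X[:, 1]           # two informative features, eight noise
y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

# Elastic-net regularized GLM (logistic); l1_ratio mixes lasso and ridge penalties,
# so uninformative features tend to be shrunk toward or exactly to zero
enet = LogisticRegression(penalty="elasticnet", solver="saga",
                          l1_ratio=0.5, C=1.0, max_iter=5000)
prob = cross_val_predict(enet, X, y, cv=5, method="predict_proba")[:, 1]
auc = roc_auc_score(y, prob)
```

Reporting the cross-validated AUC, as above, avoids the optimism of in-sample discrimination statistics.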
Probabilistic, Multivariable Flood Loss Modeling on the Mesoscale with BT-FLEMO.
Kreibich, Heidi; Botto, Anna; Merz, Bruno; Schröter, Kai
2017-04-01
Flood loss modeling is an important component for risk analyses and decision support in flood risk management. Commonly, flood loss models describe complex damaging processes by simple, deterministic approaches like depth-damage functions and are associated with large uncertainty. To improve flood loss estimation and to provide quantitative information about the uncertainty associated with loss modeling, a probabilistic, multivariable Bagging decision Tree Flood Loss Estimation MOdel (BT-FLEMO) for residential buildings was developed. The application of BT-FLEMO provides a probability distribution of estimated losses to residential buildings per municipality. BT-FLEMO was applied and validated at the mesoscale in 19 municipalities that were affected during the 2002 flood by the River Mulde in Saxony, Germany. Validation was undertaken on the one hand via a comparison with six deterministic loss models, including both depth-damage functions and multivariable models. On the other hand, the results were compared with official loss data. BT-FLEMO outperforms deterministic, univariable, and multivariable models with regard to model accuracy, although the prediction uncertainty remains high. An important advantage of BT-FLEMO is the quantification of prediction uncertainty. The probability distribution of loss estimates by BT-FLEMO well represents the variation range of loss estimates of the other models in the case study. © 2016 Society for Risk Analysis.
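The bagging-decision-tree idea behind BT-FLEMO, where the spread of per-tree predictions yields a loss distribution rather than a point estimate, can be sketched as follows. The covariates and their coefficients are invented stand-ins for real flood loss data.

```python
import numpy as np
from sklearn.ensemble import BaggingRegressor
from sklearn.tree import DecisionTreeRegressor

# Synthetic stand-in for flood loss records (assumed covariates and effect sizes)
rng = np.random.default_rng(0)
n = 500
X = np.column_stack([rng.uniform(0, 3, n),      # water depth [m]
                     rng.uniform(1, 72, n),     # inundation duration [h]
                     rng.uniform(50, 500, n)])  # building value [k EUR]
loss = 0.1 * X[:, 0] * X[:, 2] + 0.01 * X[:, 1] + rng.normal(0, 5, n)

bt = BaggingRegressor(DecisionTreeRegressor(), n_estimators=100,
                      random_state=0).fit(X, loss)

# One prediction per bootstrap tree -> an empirical loss distribution for a new case
x_new = np.array([[2.0, 24.0, 300.0]])
tree_preds = np.array([t.predict(x_new)[0] for t in bt.estimators_])
lo, hi = np.percentile(tree_preds, [5, 95])
```

The interval (lo, hi) quantifies the prediction uncertainty that deterministic depth-damage functions leave unreported.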
Multivariable Model for Time to First Treatment in Patients With Chronic Lymphocytic Leukemia
Wierda, William G.; O'Brien, Susan; Wang, Xuemei; Faderl, Stefan; Ferrajoli, Alessandra; Do, Kim-Anh; Garcia-Manero, Guillermo; Cortes, Jorge; Thomas, Deborah; Koller, Charles A.; Burger, Jan A.; Lerner, Susan; Schlette, Ellen; Abruzzo, Lynne; Kantarjian, Hagop M.; Keating, Michael J.
2011-01-01
Purpose The clinical course for patients with chronic lymphocytic leukemia (CLL) is diverse; some patients have indolent disease, never needing treatment, whereas others have aggressive disease requiring early treatment. We continue to use criteria for active disease to initiate therapy. Multivariable analysis was performed to identify prognostic factors independently associated with time to first treatment for patients with CLL. Patients and Methods Traditional laboratory, clinical prognostic, and newer prognostic factors such as fluorescent in situ hybridization (FISH), IGHV mutation status, and ZAP-70 expression evaluated at first patient visit to MD Anderson Cancer Center were correlated by multivariable analysis with time to first treatment. This multivariable model was used to develop a nomogram—a weighted tool to calculate 2- and 4-year probability of treatment and estimate median time to first treatment. Results There were 930 previously untreated patients who had traditional and new prognostic factors evaluated; they did not have active CLL requiring initiation of treatment within 3 months of first visit and were observed for time to first treatment. The following were independently associated with shorter time to first treatment: three involved lymph node sites, increased size of cervical lymph nodes, presence of 17p deletion or 11q deletion by FISH, increased serum lactate dehydrogenase, and unmutated IGHV mutation status. Conclusion We developed a multivariable model that incorporates traditional and newer prognostic factors to identify patients at high risk for progression to treatment. This model may be useful to identify patients for early interventional trials. PMID:21969505
Multivariate time series modeling of short-term system scale irrigation demand
Perera, Kushan C.; Western, Andrew W.; George, Biju; Nawarathna, Bandara
2015-12-01
Travel time limits the ability of irrigation system operators to react to short-term irrigation demand fluctuations that result from variations in weather, including very hot periods and rainfall events, as well as the various other pressures and opportunities that farmers face. Short-term system-wide irrigation demand forecasts can assist in system operation. Here we developed multivariate time series (ARMAX) models to forecast irrigation demands in terms of aggregated service point flows (IDCGi,ASP) and offtake regulator flows (IDCGi,OTR) across five command areas, which covered four irrigation channels and the whole study area. These command-area-specific ARMAX models forecast daily IDCGi,ASP and IDCGi,OTR one to five days ahead using real-time flow data recorded at the service points and the uppermost regulators, together with observed meteorological data collected from automatic weather stations. Model efficiency and predictive performance were quantified using the root mean squared error (RMSE), the Nash-Sutcliffe model efficiency coefficient (NSE), the anomaly correlation coefficient (ACC) and the mean square skill score (MSSS). During the evaluation period, NSE for IDCGi,ASP and IDCGi,OTR across the five command areas ranged from 0.78 to 0.98. The models were capable of generating skillful forecasts (MSSS ⩾ 0.5 and ACC ⩾ 0.6) of IDCGi,ASP and IDCGi,OTR for all five lead days, and these forecasts were better than using the long-term monthly mean irrigation demand. Overall, the predictive performance of the ARMAX time series models was higher than in almost all previous studies we are aware of. Further, the IDCGi,ASP and IDCGi,OTR forecasts improved the operators' ability to react to near-future irrigation demand fluctuations, as the ARMAX models were self-adaptive in reflecting short-term changes in irrigation demand with respect to the various pressures and opportunities that farmers face, such as
Directory of Open Access Journals (Sweden)
Abdul Wahid
2016-02-01
Distillation columns are widely used in the chemical industry as unit operations and require advanced process control because they are multi-input multi-output (MIMO), or multivariable, systems, which are hard to control. Model predictive control (MPC) is an alternative controller developed for MIMO systems with interacting loops. This study aimed to obtain a dynamic model of process control for a distillation column using MPC, and to achieve optimum MPC controller performance. Process control of the distillation column was performed by simulating its dynamic models in the UNISIM R390.1 software. The optimization was carried out by tuning the MPC controller parameters: sampling time (Ts = 1-240 s), prediction horizon (P = 1-400), and control horizon (M = 1-400). The performance of the MPC and PI controllers is compared, using the Integral Absolute Error (IAE) as the comparison parameter. The results indicate that the performance of MPC was better than the PI controller for a set-point change from 0.95 to 0.94 in distillate product composition using modified model 1, with an IAE of 0.0584 for the MPC controller and 0.0782 for the PI controller.
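The IAE criterion used to compare the two controllers is straightforward to compute. The sketch below evaluates IAE for two assumed first-order closed-loop responses to the 0.95 → 0.94 set-point step; the time constants are illustrative, not taken from the UNISIM simulation.

```python
import numpy as np

def iae(setpoint, response, dt):
    """Integral of Absolute Error: IAE = sum |sp - y(t)| * dt (rectangle rule)."""
    return float(np.sum(np.abs(setpoint - response)) * dt)

# Assumed first-order responses to the set-point step 0.95 -> 0.94
t = np.arange(0.0, 60.0, 1.0)
sp = np.full_like(t, 0.94)
y_mpc = 0.94 + 0.01 * np.exp(-t / 5.0)   # faster-settling loop
y_pi = 0.94 + 0.01 * np.exp(-t / 8.0)    # slower-settling loop
iae_mpc = iae(sp, y_mpc, 1.0)
iae_pi = iae(sp, y_pi, 1.0)
```

A faster settling response accumulates less absolute error, so the quicker loop gets the smaller IAE, which is the sense in which the abstract's MPC result (0.0584 vs 0.0782) indicates better performance.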
Coelho, Antonio Augusto Rodrigues
2016-01-01
This paper introduces the Fuzzy Logic Hypercube Interpolator (FLHI) and demonstrates applications in control of multiple-input single-output (MISO) and multiple-input multiple-output (MIMO) processes with Hammerstein nonlinearities. FLHI consists of a Takagi-Sugeno fuzzy inference system where membership functions act as kernel functions of an interpolator. Conjunction of membership functions in a unitary hypercube space enables multivariable interpolation in N dimensions. Membership functions act as interpolation kernels, such that the choice of membership functions determines interpolation characteristics, allowing FLHI to behave as a nearest-neighbor, linear, cubic, spline or Lanczos interpolator, to name a few. The proposed interpolator is presented as a solution to the modeling problem of static nonlinearities since it is capable of modeling both a function and its inverse function. Three study cases from the literature are presented: a single-input single-output (SISO) system, a MISO and a MIMO system. Good results are obtained regarding performance metrics such as set-point tracking, control variation and robustness. Results demonstrate applicability of the proposed method in modeling Hammerstein nonlinearities and their inverse functions for implementation of an output compensator with Model Based Predictive Control (MBPC), in particular Dynamic Matrix Control (DMC). PMID:27657723
A multivariate linear regression model for the Jordanian industrial electric energy consumption
Energy Technology Data Exchange (ETDEWEB)
Al-Ghandoor, A.; Nahleh, Y.A.; Sandouqa, Y.; Al-Salaymeh, M. [Hashemite Univ., Zarqa (Jordan). Dept. of Industrial Engineering
2007-08-09
The amount of electricity used by the industrial sector in Jordan is an important driver for determining the future energy needs of the country. This paper proposed a model to simulate electricity and energy consumption by industry. The general model approach was based on multivariate regression analysis to provide valuable information regarding energy demands and analysis, and to identify the various factors that influence Jordanian industrial electricity consumption. It was determined that industrial gross output and capacity utilization are the most important variables that drive electricity consumption. The results revealed that the multivariate linear regression model can be used to adequately model the Jordanian industrial electricity consumption, with coefficient of determination (R²) and adjusted R² values of 99.3 and 99.2 per cent, respectively. 19 refs., 4 tabs., 2 figs.
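The multivariate regression fit and the reported goodness-of-fit statistics (R² and adjusted R²) can be reproduced in a few lines. The driver series below are hypothetical stand-ins for the Jordanian gross-output and capacity-utilization data.

```python
import numpy as np

def ols_r2(X, y):
    """OLS with R^2 and adjusted R^2 (p predictors plus an intercept)."""
    n, p = X.shape
    Xd = np.column_stack([np.ones(n), X])
    beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)
    ss_res = np.sum((y - Xd @ beta) ** 2)
    ss_tot = np.sum((y - y.mean()) ** 2)
    r2 = 1 - ss_res / ss_tot
    adj_r2 = 1 - (1 - r2) * (n - 1) / (n - p - 1)
    return beta, r2, adj_r2

# Hypothetical stand-ins for industrial gross output and capacity utilization
rng = np.random.default_rng(0)
output = rng.uniform(100, 1000, 60)
util = rng.uniform(0.5, 1.0, 60)
electricity = 0.5 * output + 200.0 * util + rng.normal(0, 10, 60)
beta, r2, adj_r2 = ols_r2(np.column_stack([output, util]), electricity)
```

Adjusted R² penalizes each added predictor, which is why it is always at or below the plain R² and is the fairer statistic to report for a multivariate model.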
Multivariate Calibration Models for Sorghum Composition using Near-Infrared Spectroscopy
Energy Technology Data Exchange (ETDEWEB)
Wolfrum, E.; Payne, C.; Stefaniak, T.; Rooney, W.; Dighe, N.; Bean, B.; Dahlberg, J.
2013-03-01
NREL developed calibration models based on near-infrared (NIR) spectroscopy coupled with multivariate statistics to predict compositional properties relevant to cellulosic biofuels production for a variety of sorghum cultivars. A robust calibration population was developed in an iterative fashion. The quality of models developed using the same sample geometry on two different types of NIR spectrometers and two different sample geometries on the same spectrometer did not vary greatly.
Improvement of a Robotic Manipulator Model Based on Multivariate Residual Modeling
Directory of Open Access Journals (Sweden)
Serge Gale
2017-07-01
A new method is presented for extending a dynamic model of a six-degrees-of-freedom robotic manipulator. A non-linear multivariate calibration of input-output training data from several typical motion trajectories is carried out with the aim of predicting the model's systematic output error at time (t + 1) from the known input reference up to and including time (t). A new partial least squares regression (PLSR) based method, nominal PLSR with interactions, was developed and used to handle unmodelled non-linearities. The performance of the new method is compared with least squares (LS). Different cross-validation schemes were compared in order to assess the sampling of the state space based on conventional trajectories. The method developed in the paper can be used as a fault monitoring mechanism and early warning system for sensor failure. The results show that the suggested method improves the trajectory tracking performance of the robotic manipulator by extending the initial dynamic model of the manipulator.
An empirical approach to update multivariate regression models intended for routine industrial use
Energy Technology Data Exchange (ETDEWEB)
Garcia-Mencia, M.V.; Andrade, J.M.; Lopez-Mahia, P.; Prada, D. [University of La Coruna, La Coruna (Spain). Dept. of Analytical Chemistry
2000-11-01
Many problems currently tackled by analysts are highly complex and, accordingly, multivariate regression models need to be developed. Two intertwined topics are important when such models are to be applied within industrial routines: (1) Did the model account for the 'natural' variance of the production samples? (2) Is the model stable over time? This paper focuses on the second topic, and it presents an empirical approach in which predictive models developed using Mid-FTIR with PLS and PCR held their utility for about nine months when used to predict the octane number of platforming naphthas in a petrochemical refinery. 41 refs., 10 figs., 1 tab.
POWERLIB: SAS/IML Software for Computing Power in Multivariate Linear Models
Directory of Open Access Journals (Sweden)
Jacqueline L. Johnson
2009-04-01
The POWERLIB SAS/IML software provides convenient power calculations for a wide range of multivariate linear models with Gaussian errors. The software includes the Box, Geisser-Greenhouse, Huynh-Feldt, and uncorrected tests in the "univariate" approach to repeated measures (UNIREP), the Hotelling-Lawley Trace, Pillai-Bartlett Trace, and Wilks Lambda tests in the "multivariate" approach (MULTIREP), as well as a limited but useful range of mixed models. The familiar univariate linear model with Gaussian errors is an important special case. For estimated covariance, the software provides confidence limits for the resulting estimated power. All power and confidence limit values can be output to a SAS dataset, which can be used to easily produce plots and tables for manuscripts.
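The core computation behind such software for the univariate linear model, the power of an F test given a noncentrality parameter, can be reproduced with SciPy's noncentral F distribution; the degrees of freedom and noncentrality below are illustrative.

```python
from scipy.stats import f, ncf

def glm_power(alpha, dfn, dfd, noncentrality):
    """Power of the F test in a univariate linear model with Gaussian errors:
    P(F' > F_crit), where F' is noncentral F under the alternative."""
    crit = f.ppf(1 - alpha, dfn, dfd)          # central-F critical value under H0
    return ncf.sf(crit, dfn, dfd, noncentrality)

# Illustrative case: one 1-df contrast, 40 subjects (dfd = 38), noncentrality 10
power = glm_power(0.05, 1, 38, 10.0)
```

Power is monotone in the noncentrality parameter, which in turn grows with effect size and sample size; confidence limits on power, as POWERLIB provides, follow from propagating uncertainty in the estimated covariance into the noncentrality.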
Joint multivariate statistical model and its applications to the synthetic earthquake prediction
Institute of Scientific and Technical Information of China (English)
韩天锡; 蒋淳; 魏雪丽; 韩梅; 冯德益
2004-01-01
Considering the problems that should be solved in synthetic earthquake prediction at present, a new model is proposed in this paper: a joint multivariate statistical model that combines principal component analysis with discriminatory analysis. Principal component analysis and discriminatory analysis are important theories in multivariate statistical analysis, which has developed quickly over the last thirty years. By means of the maximization information method, we choose, from numerous candidates, several earthquake prediction factors whose cumulative proportions of total sample variance exceed 90%. The paper applies regression analysis and Mahalanobis discrimination to extrapolating synthetic prediction. Furthermore, we use this model to characterize and predict earthquakes in North China (30°-42°N, 108°-125°E), and better prediction results are obtained.
Kim, Jieun; Zhu, Wei; Chang, Linda; Bentler, Peter M; Ernst, Thomas
2007-02-01
The ultimate goal of brain connectivity studies is to propose, test, modify, and compare certain directional brain pathways. Path analysis or structural equation modeling (SEM) is an ideal statistical method for such studies. In this work, we propose a two-stage unified SEM plus GLM (General Linear Model) approach for the analysis of multisubject, multivariate functional magnetic resonance imaging (fMRI) time series data with subject-level covariates. In Stage 1, we analyze the fMRI multivariate time series for each subject individually via a unified SEM model by combining longitudinal pathways represented by a multivariate autoregressive (MAR) model, and contemporaneous pathways represented by a conventional SEM. In Stage 2, the resulting subject-level path coefficients are merged with subject-level covariates such as gender, age, IQ, etc., to examine the impact of these covariates on effective connectivity via a GLM. Our approach is exemplified via the analysis of an fMRI visual attention experiment. Furthermore, the significant path network from the unified SEM analysis is compared to that from a conventional SEM analysis without incorporating the longitudinal information as well as that from a Dynamic Causal Modeling (DCM) approach.
Ghanate, A D; Kothiwale, S; Singh, S P; Bertrand, Dominique; Krishna, C Murali
2011-02-01
Cancer is now recognized as one of the major causes of morbidity and mortality. Histopathological diagnosis, the gold standard, is shown to be subjective, time-consuming, prone to interobserver disagreement, and often fails to predict prognosis. Optical spectroscopic methods are being contemplated as adjuncts or alternatives to conventional cancer diagnostics. The most important aspect of these approaches is their objectivity, and multivariate statistical tools play a major role in realizing it. However, rigorous evaluation of the robustness of spectral models is a prerequisite. The utility of Raman spectroscopy in the diagnosis of cancers has been well established. Until now, the specificity and applicability of spectral models have been evaluated for specific cancer types. In this study, we have evaluated the utility of spectroscopic models representing normal and malignant tissues of the breast, cervix, colon, larynx, and oral cavity in a broader perspective, using different multivariate tests. The limit test, which was used in our earlier study, gave high sensitivity but suffered from poor specificity. The performance of methods such as factorial discriminant analysis and partial least squares discriminant analysis is on par with more complex nonlinear methods such as decision trees, but they provide very little information about the classification model. This comparative study thus demonstrates not just the efficacy of Raman spectroscopic models but also the applicability and limitations of different multivariate tools for discrimination under complex conditions such as the multicancer scenario.
Directory of Open Access Journals (Sweden)
Eva Fišerová
2014-06-01
Full Text Available The paper is focused on the decomposition of mixed partitioned multivariate models into two seemingly unrelated submodels in order to obtain more efficient estimators. The multiresponses are independently normally distributed with the same covariance matrix. The partitioned multivariate model is considered either with, or without, an intercept. The elimination transformation of the intercept that preserves the BLUEs of parameter matrices and the MINQUE of the variance components in multivariate models with and without an intercept is stated. Procedures for testing the decomposition of the partitioned model are presented. The properties of plug-in test statistics as functions of variance components are investigated by sensitivity analysis, and insensitivity regions for the significance level are proposed. The insensitivity region is a safe region in the parameter space of the variance components where an approximation of the variance components can be used without any essential deterioration of the significance level of the plug-in test statistic. The behavior of plug-in test statistics and insensitivity regions is studied by simulations.
Su, Liyun; Zhao, Yanyong; Yan, Tianshun; Li, Fenglan
2012-01-01
Multivariate local polynomial fitting is applied to the multivariate linear heteroscedastic regression model. First, local polynomial fitting is applied to estimate the heteroscedastic function, and then the coefficients of the regression model are obtained by the generalized least squares method. One noteworthy feature of our approach is that we avoid testing for heteroscedasticity by improving the traditional two-stage method. Owing to the non-parametric nature of local polynomial estimation, it is unnecessary to know the form of the heteroscedastic function, so we can improve the estimation precision when the heteroscedastic function is unknown. Furthermore, we verify that the regression coefficients are asymptotically normal based on numerical simulations and normal Q-Q plots of residuals. Finally, the simulation results and the local polynomial estimation of real data indicate that our approach is effective in finite-sample situations.
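A minimal version of the improved two-stage procedure, substituting the paper's multivariate local polynomial fit with a simple Nadaraya-Watson smoother of the squared OLS residuals, might look like this. The `two_stage_wls` helper and the simulated data are illustrative assumptions, not the authors' estimator.

```python
import numpy as np

def two_stage_wls(X, y, bandwidth=0.5):
    """Two-stage estimator: OLS residuals -> kernel-smoothed variance
    function -> weighted (generalized) least squares."""
    # Stage 1: OLS fit and squared residuals
    beta_ols, *_ = np.linalg.lstsq(X, y, rcond=None)
    r2 = (y - X @ beta_ols) ** 2
    # Local constant (Nadaraya-Watson) estimate of Var(y|x) at each point,
    # a simple stand-in for the local polynomial fit in the paper
    t = X[:, 1]  # smooth over the single covariate
    K = np.exp(-0.5 * ((t[:, None] - t[None, :]) / bandwidth) ** 2)
    sigma2 = K @ r2 / K.sum(axis=1)
    # Stage 2: weighted least squares with weights 1/sigma2
    W = 1.0 / sigma2
    XtWX = X.T @ (W[:, None] * X)
    return np.linalg.solve(XtWX, X.T @ (W * y))

rng = np.random.default_rng(0)
n = 400
x = rng.uniform(0, 1, n)
X = np.column_stack([np.ones(n), x])
y = 1.0 + 2.0 * x + rng.normal(0, 0.2 + x, n)  # variance grows with x
beta = two_stage_wls(X, y)
assert abs(beta[1] - 2.0) < 0.5
```

No heteroscedasticity test appears anywhere in the pipeline: the estimated variance function is used directly as the GLS weight, which is the point the abstract emphasizes.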
Tescione, Lia; Lambropoulos, James; Paranandi, Madhava Ram; Makagiansar, Helena; Ryll, Thomas
2015-01-01
A bench scale cell culture model representative of manufacturing scale (2,000 L) was developed based on oxygen mass transfer principles, for a CHO-based process producing a recombinant human protein. Cell culture performance differences across scales are characterized most often by sub-optimal performance in manufacturing scale bioreactors. By contrast, in this study, reduced growth rates were observed at bench scale during the initial model development. Bioreactor models based on power per unit volume (P/V), volumetric mass transfer coefficient (kLa), and oxygen transfer rate (OTR) were evaluated to address this scale performance difference. Lower viable cell densities observed for the P/V model were attributed to higher sparge rates and the reduced oxygen mass transfer efficiency (kLa) of the small scale hole spargers. Increasing the sparger kLa by decreasing the pore size resulted in a further decrease in growth at bench scale. Due to the sensitivity of the cell line to gas sparge rate and bubble size revealed by the P/V and kLa models, an OTR model based on oxygen enrichment and increased P/V was selected that generated endpoint sparge rates representative of the 2,000 L scale. This final bench scale model generated growth rates similar to manufacturing. In order to take into account other routinely monitored process parameters besides growth, a multivariate statistical approach was applied to demonstrate the validity of the small scale model. After the model was selected based on univariate and multivariate analysis, product quality data were generated and verified to fall within the 95% confidence limit of the multivariate model.
McKinney, Cliff; Renk, Kimberly
2008-01-01
Although parent-adolescent interactions have been examined, relevant variables have not been integrated into a multivariate model. As a result, this study examined a multivariate model of parent-late adolescent gender dyads in an attempt to capture important predictors in late adolescents' important and unique transition to adulthood. The sample…
Ding, Lili; Kurowski, Brad G; He, Hua; Alexander, Eileen S.; Mersha, Tesfaye B.; Fardo, David W.; Zhang, Xue; Pilipenko, Valentina V; Kottyan, Leah; Martin, Lisa J.
2014-01-01
Genetic studies often collect data on multiple traits. Most genetic association analyses, however, consider traits separately and ignore potential correlation among traits, partially because of difficulties in statistical modeling of multivariate outcomes. When multiple traits are measured in a pedigree longitudinally, additional challenges arise because in addition to correlation between traits, a trait is often correlated with its own measures over time and with measurements of other family...
Wang, Wan-Lun; Lin, Tsung-I
2014-07-30
The multivariate nonlinear mixed-effects model (MNLMM) has emerged as an effective tool for modeling multi-outcome longitudinal data following nonlinear growth patterns. In the framework of the MNLMM, the random effects and within-subject errors are assumed to be normally distributed for mathematical tractability and computational simplicity. However, a serious departure from normality may cause a lack of robustness and subsequently invalidate inference. This paper presents a robust extension of the MNLMM obtained by considering a joint multivariate t distribution for the random effects and within-subject errors, called the multivariate t nonlinear mixed-effects model. Moreover, a damped exponential correlation structure is employed to capture the extra serial correlation among irregularly observed multiple repeated measures. An efficient expectation conditional maximization algorithm coupled with a first-order Taylor approximation is developed for maximizing the complete pseudo-data likelihood function. Techniques for the estimation of random effects, imputation of missing responses and identification of potential outliers are also investigated. The methodology is motivated by a real data example on 161 pregnant women from a study in a private fertilization obstetrics clinic in Santiago, Chile, and is used to analyze these data.
Multivariable Adaptive Controller for the Nonlinear MIMO Model of a Container Ship
Directory of Open Access Journals (Sweden)
Michal Brasel
2014-03-01
Full Text Available The paper presents an adaptive multivariable control system for a Multi-Input, Multi-Output (MIMO) nonlinear dynamic process. The problems under study are exemplified by the synthesis of a course angle and forward speed control system for the nonlinear four-Degrees-of-Freedom (4-DoF) mathematical model of a single-screw, high-speed container ship. The paper presents the complexity of the model to be analyzed and a synthesis method for the multivariable adaptive modal controller. Due to the strongly nonlinear nature of the ship's equations of motion, the multivariable adaptive controller is tuned to the changeable hydrodynamic operating conditions of the ship. In accordance with the given operating conditions, controller parameters are chosen on the basis of four measured auxiliary signals. The system synthesis is carried out by linearization of the nonlinear model of the ship at its nominal steady-state operating points and by means of a pole placement control method. The final part of the paper includes results of simulation tests of the proposed control system carried out in the MATLAB/Simulink environment, along with conclusions and final remarks.
Modarres, Reza; Ouarda, Taha B. M. J.; Vanasse, Alain; Orzanco, Maria Gabriela; Gosselin, Pierre
2014-07-01
Changes in extreme meteorological variables and the demographic shift towards an older population have made it important to investigate the association of climate variables and hip fracture by advanced methods, in order to determine the climate variables that most affect hip fracture incidence. The nonlinear autoregressive moving average with exogenous variables-generalized autoregressive conditional heteroscedasticity (ARMAX-GARCH) and multivariate GARCH (MGARCH) time series approaches were applied to investigate the nonlinear association between hip fracture rate in female and male patients aged 40-74 and 75+ years and climate variables in the period 1993-2004 in Montreal, Canada. The models describe 50-56 % of daily variation in hip fracture rate and identify snow depth, air temperature, day length and air pressure as the variables influencing the time-varying mean and variance of the hip fracture rate. The conditional covariance between climate variables and hip fracture rate increases exponentially, showing that the effect of climate variables on hip fracture rate is most acute when rates are high and climate conditions are at their worst. In Montreal, climate variables, particularly snow depth and air temperature, appear to be important predictors of hip fracture incidence. The association of climate variables and hip fracture does not seem to change linearly with time, but increases exponentially under harsh climate conditions. The results of this study can be used to provide an adaptive climate-related public health program and to guide allocation of services for avoiding hip fracture risk.
Energy Technology Data Exchange (ETDEWEB)
Rupšys, P. [Aleksandras Stulginskis University, Studenų g. 11, Akademija, Kaunas district, LT – 53361 Lithuania (Lithuania)
2015-10-28
A system of stochastic differential equations (SDE) with mixed-effects parameters and multivariate normal copula density function were used to develop tree height model for Scots pine trees in Lithuania. A two-step maximum likelihood parameter estimation method is used and computational guidelines are given. After fitting the conditional probability density functions to outside bark diameter at breast height, and total tree height, a bivariate normal copula distribution model was constructed. Predictions from the mixed-effects parameters SDE tree height model calculated during this research were compared to the regression tree height equations. The results are implemented in the symbolic computational language MAPLE.
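The copula construction described above, coupling marginal distributions for diameter and height through a bivariate normal copula, can be sketched as follows. The lognormal marginals and parameter values are hypothetical stand-ins, not the fitted SDE marginals from the paper.

```python
import numpy as np
from scipy import stats

def gaussian_copula_pdf(u, v, rho):
    """Density of the bivariate Gaussian copula at (u, v) in (0,1)^2."""
    x, y = stats.norm.ppf(u), stats.norm.ppf(v)
    det = 1.0 - rho ** 2
    expo = -(rho ** 2 * (x ** 2 + y ** 2) - 2 * rho * x * y) / (2 * det)
    return np.exp(expo) / np.sqrt(det)

def joint_pdf(d, h, marg_d, marg_h, rho):
    """Joint density of diameter d and height h: copula times marginals."""
    u, v = marg_d.cdf(d), marg_h.cdf(h)
    return gaussian_copula_pdf(u, v, rho) * marg_d.pdf(d) * marg_h.pdf(h)

# Hypothetical lognormal marginals for diameter (cm) and height (m)
marg_d = stats.lognorm(s=0.4, scale=25.0)
marg_h = stats.lognorm(s=0.25, scale=18.0)
val = joint_pdf(25.0, 18.0, marg_d, marg_h, rho=0.8)
assert val > 0
# With rho = 0 the copula density is 1 and the joint factorizes
indep = joint_pdf(25.0, 18.0, marg_d, marg_h, rho=0.0)
assert np.isclose(indep, marg_d.pdf(25.0) * marg_h.pdf(18.0))
```

The practical appeal is the same as in the paper: the dependence structure (rho) is estimated separately from the marginal height and diameter distributions.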
Do We Really Need Both BEKK and DCC? A Tale of Two Multivariate GARCH Models
Caporin, Massimiliano; McAleer, Michael
2010-01-01
The management and monitoring of very large portfolios of financial assets are routine for many individuals and organizations. The two most widely used models of conditional covariances and correlations in the class of multivariate GARCH models are BEKK and DCC. It is well known that BEKK suffers from the archetypal “curse of dimensionality”, whereas DCC does not. It is argued in this paper that this is a misleading interpretation of the suitability of the two models for use in pr...
Directory of Open Access Journals (Sweden)
Houda Salhi
2016-01-01
Full Text Available This paper deals with the parameter estimation problem for multivariable nonlinear systems described by MIMO state-space Wiener models. Recursive parameter and state estimation algorithms are presented using the least squares technique, the adjustable model, and Kalman filter theory. The basic idea is to estimate jointly the parameters, the state vector, and the internal variables of MIMO Wiener models based on a specific decomposition technique to extract the internal vector and avoid problems related to the invertibility assumption. The effectiveness of the proposed algorithms is shown by an illustrative simulation example.
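The least squares building block of such recursive schemes can be illustrated with a single RLS update step. `rls_step` and the toy linear-block identification below are a generic sketch, not the authors' joint parameter/state algorithm for Wiener models.

```python
import numpy as np

def rls_step(theta, P, phi, y, lam=1.0):
    """One recursive least squares update for regressor phi and
    observation y, with forgetting factor lam."""
    K = P @ phi / (lam + phi @ P @ phi)      # gain vector
    theta = theta + K * (y - phi @ theta)    # parameter update
    P = (P - np.outer(K, phi @ P)) / lam     # covariance update
    return theta, P

# Identify a 2-parameter linear block from noisy input/output data
rng = np.random.default_rng(5)
true = np.array([0.7, -0.3])
theta, P = np.zeros(2), 1000.0 * np.eye(2)
for _ in range(500):
    phi = rng.normal(size=2)
    y = phi @ true + 0.01 * rng.normal()
    theta, P = rls_step(theta, P, phi, y)
assert np.allclose(theta, true, atol=0.05)
```

In the Wiener-model setting the regressor would contain the reconstructed internal variables rather than raw measurements, which is where the decomposition technique of the paper comes in.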
Directory of Open Access Journals (Sweden)
J. C. Ochoa-Rivera
2002-01-01
Full Text Available A model for multivariate streamflow generation is presented, based on a multilayer feedforward neural network. The structure of the model results from two components: the neural network (NN) deterministic component and a random component which is assumed to be normally distributed. It is from this second component that the model achieves the ability to incorporate effectively the uncertainty associated with hydrological processes, making it valuable as a practical tool for synthetic generation of streamflow series. The NN topology and the corresponding analytical explicit formulation of the model are described in detail. The model is calibrated with a series of monthly inflows to two reservoir sites located in the Tagus River basin (Spain), while validation is performed through estimation of a set of statistics that is relevant for water resources systems planning and management. Among others, drought and storage statistics are computed and compared for both the synthetic and historical series. The performance of the NN-based model was compared to that of a standard autoregressive AR(2) model. Results show that the NN represents a promising modelling alternative for simulation purposes, with interesting potential in the context of water resources systems management and optimisation. Keywords: neural networks, multilayer perceptron, error backpropagation, hydrological scenario generation, multivariate time series.
Forecasting of municipal solid waste quantity in a developing country using multivariate grey models
Energy Technology Data Exchange (ETDEWEB)
Intharathirat, Rotchana, E-mail: rotchana.in@gmail.com [Energy Field of Study, School of Environment, Resources and Development, Asian Institute of Technology, P.O. Box 4, KlongLuang, Pathumthani 12120 (Thailand); Abdul Salam, P., E-mail: salam@ait.ac.th [Energy Field of Study, School of Environment, Resources and Development, Asian Institute of Technology, P.O. Box 4, KlongLuang, Pathumthani 12120 (Thailand); Kumar, S., E-mail: kumar@ait.ac.th [Energy Field of Study, School of Environment, Resources and Development, Asian Institute of Technology, P.O. Box 4, KlongLuang, Pathumthani 12120 (Thailand); Untong, Akarapong, E-mail: akarapong_un@hotmail.com [School of Tourism Development, Maejo University, Chiangmai (Thailand)
2015-05-15
Highlights: • Grey models can be used to forecast MSW quantity accurately with limited data. • Prediction intervals overcome the uncertainty of the MSW forecast effectively. • A multivariate model gives accuracy associated with factors affecting MSW quantity. • Population, urbanization, employment and household size play a role in MSW quantity. - Abstract: In order to plan, manage and use municipal solid waste (MSW) in a sustainable way, accurate forecasting of MSW generation and composition plays a key role. It is difficult to produce reliable estimates with existing models because of the limited data available in developing countries. This study aims to forecast the MSW collected in Thailand, with prediction intervals, over a long-term period, using optimized multivariate grey models. For the multivariate models, the representative factors of the residential and commercial sectors affecting the waste collected are identified, classified and quantified based on the statistics and mathematics of grey system theory. Results show that GMC (1, 5), the grey model with convolution integral, is the most accurate, with the least error of 1.16% MAPE. The MSW collected would increase by 1.40% per year, from 43,435-44,994 tonnes per day in 2013 to 55,177-56,735 tonnes per day in 2030. The model also shows that population density is the most important factor affecting the MSW collected, followed by urbanization, proportion of employment and household size, respectively. This suggests that the representative factors of the commercial sector may affect the MSW collected more than those of the residential sector. The results can help decision makers to develop measures and policies for long-term waste management.
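The univariate GM(1,1) model underlying grey forecasting can be sketched in a few lines; the GMC(1,5) used in the study extends this with a convolution integral over four exogenous driving series. The `gm11_forecast` helper and the toy series are illustrative, not the paper's fitted model.

```python
import numpy as np

def gm11_forecast(x, steps):
    """Classic GM(1,1) grey forecast (univariate sketch)."""
    x = np.asarray(x, float)
    x1 = np.cumsum(x)                        # accumulated generating operation
    z = 0.5 * (x1[1:] + x1[:-1])             # background values
    B = np.column_stack([-z, np.ones(len(z))])
    a, b = np.linalg.lstsq(B, x[1:], rcond=None)[0]
    k = np.arange(len(x) + steps)
    x1_hat = (x[0] - b / a) * np.exp(-a * k) + b / a
    return np.diff(x1_hat)[-steps:]          # restore to the original series

series = [41.5, 42.1, 42.9, 43.4]            # hypothetical MSW series (kt/day)
pred = gm11_forecast(series, steps=2)
assert pred[0] > 43.0                        # continues the upward trend
```

The appeal for developing-country data, as the abstract notes, is that the accumulated-generating step lets the model extract a trend from very short series.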
Melo, Tatiane F N; Patriota, Alexandre G
2012-01-01
In this paper, we develop a modified version of the likelihood ratio test for multivariate heteroskedastic errors-in-variables regression models. The error terms are allowed to follow a multivariate distribution in the elliptical class of distributions, which has the normal distribution as a special case. We derive the Skovgaard adjusted likelihood ratio statistic, which follows a chi-squared distribution with a high degree of accuracy. We conduct a simulation study and show that the proposed test displays superior finite sample behavior as compared to the standard likelihood ratio test. We illustrate the usefulness of our results in applied settings using a data set from the WHO MONICA Project on cardiovascular disease.
Multivariable direct adaptive decoupling controller using multiple models and a case study
Institute of Scientific and Technical Information of China (English)
WANG Xin; YANG Hui; ZHENG YiHui
2009-01-01
In this paper, a multivariable direct adaptive controller using multiple models, without a minimum phase assumption, is presented to improve the transient response when the parameters of the system jump abruptly. The controller is composed of multiple fixed controller models, a free-running adaptive controller model and a re-initialized adaptive controller model. The fixed controller models are derived directly from the corresponding fixed system models. The adaptive controller models adopt the direct adaptive algorithm to reduce the design computation. At every instant, the optimal controller is chosen according to the switching index. The interaction of the system is viewed as a measured disturbance, which is eliminated by the choice of the weighting polynomial matrix. Global convergence is obtained. Finally, several simulation examples from a wind tunnel experiment are given to show both the effectiveness and practicality of the proposed method. The significance of the proposed method is that it is applicable to non-minimum phase systems, adopts a direct adaptive algorithm to overcome the singularity problem during the matrix calculation, and realizes decoupling control for a multivariable system.
Libera, D.; Arumugam, S.
2015-12-01
Water quality observations are usually not available on a continuous basis because of the expense and labor requirements, so calibrating and validating a mechanistic model is often difficult. Further, any model predictions inherently have bias (i.e., under/over estimation) and require techniques that preserve the long-term mean monthly attributes. This study suggests and compares two multivariate bias-correction techniques to improve the performance of the SWAT model in predicting daily streamflow and TN loads across the southeast, based on split-sample validation. The first approach is a dimension reduction technique, canonical correlation analysis, that regresses the observed multivariate attributes on the SWAT model simulated values. The second approach, importance weighting, comes from signal processing: weights based on the ratio of the observed and model densities are applied to the model data to shift the mean, variance, and cross-correlation towards the observed values. These procedures were applied to 3 watersheds chosen from the Water Quality Network in the Southeast Region, specifically watersheds with sufficiently large drainage areas and numbers of observed data points. The performance of these two approaches is also compared with independent estimates from the USGS LOADEST model. Uncertainties in the bias-corrected estimates due to limited water quality observations are also discussed.
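The importance-weighting idea can be sketched with kernel density estimates of the observed and simulated distributions. `density_ratio_weights` and the synthetic flow data are illustrative assumptions, not the study's implementation.

```python
import numpy as np
from scipy import stats

def density_ratio_weights(model_vals, obs_vals):
    """Importance weights w(x) = f_obs(x) / f_model(x), estimated with
    Gaussian kernel density estimates of each sample."""
    f_obs = stats.gaussian_kde(obs_vals)
    f_mod = stats.gaussian_kde(model_vals)
    w = f_obs(model_vals) / f_mod(model_vals)
    return w / w.mean()                      # normalize to mean 1

rng = np.random.default_rng(1)
obs = rng.normal(10.0, 2.0, 2000)            # "observed" flows
model = rng.normal(12.0, 3.0, 2000)          # biased model simulation
w = density_ratio_weights(model, obs)
corrected_mean = np.average(model, weights=w)
# The weighted mean moves from the model mean toward the observed mean
assert abs(corrected_mean - 10.0) < abs(model.mean() - 10.0)
```

Weighted moments computed this way inherit the observed mean and variance without discarding the model's simulated time structure, which is why the study applies it jointly to streamflow and loads.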
Multivariate model to characterise relations between maize mutant starches and hydrolysis kinetics.
Kansou, Kamal; Buléon, Alain; Gérard, Catherine; Rolland-Sabaté, Agnès
2015-11-20
Many studies of amylolysis have collected considerable information regarding the contribution of starch physico-chemical properties. However, the inherently elaborate and variable structure of granular starch, and consequently the multifactorial nature of the system, hinders the interpretation of experimental results. The immediate benefit of multivariate statistical analysis in that regard is twofold: it considers the possibly interrelated factors all together rather than independently, and it provides a first estimate of the magnitude and confidence level of the relations between factors and amylolysis kinetic parameters. Based on data from the amylolysis of 13 starch samples from wild type, single and double mutants of maize by porcine pancreatic α-amylase (PPA), a multivariate analysis is proposed. Amylolysis progress curves were fitted by a Weibull function, as proposed in a previous work, to extract three kinetic parameters: the reaction rate coefficient during the first time-unit, k; the rate retardation over time, h; and the final hydrolysis extent, X∞. Multivariate models relate the macromolecular composition and the fractions of crystalline polymorphic types to the kinetic parameters. h and X∞ are found to be highly related to the measured properties. The amylose content appears to be significantly correlated with the hydrolysis rate retardation, which sheds some light on the probable contribution of the amylose molecules contained in the granules. The multivariate models give correct prediction performance except for k, part of whose variability remains unexplained. A further analysis points out the extent of granule structure characterisation needed to extend the fraction of explained variability.
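Extracting (k, h, X∞) from a progress curve amounts to a nonlinear least squares fit. The exact Weibull parameterization used in the cited work may differ, so the `weibull_progress` form and the synthetic data below are assumptions.

```python
import numpy as np
from scipy.optimize import curve_fit

def weibull_progress(t, x_inf, k, h):
    """Weibull-type hydrolysis progress curve (one common form):
    X(t) = X_inf * (1 - exp(-k * t**(1-h) / (1-h))),
    with k the initial rate coefficient, h the rate retardation over time,
    and X_inf the final hydrolysis extent."""
    return x_inf * (1.0 - np.exp(-k * t ** (1.0 - h) / (1.0 - h)))

# Synthetic progress curve with known parameters plus small noise
t = np.linspace(0.1, 48.0, 60)
rng = np.random.default_rng(2)
y = weibull_progress(t, 0.85, 0.30, 0.45) + rng.normal(0.0, 0.005, t.size)
popt, _ = curve_fit(weibull_progress, t, y, p0=[0.8, 0.2, 0.3],
                    bounds=([0.0, 0.0, 0.0], [1.0, 5.0, 0.95]))
x_inf, k, h = popt
assert abs(x_inf - 0.85) < 0.05   # final extent recovered
```

The fitted triplet per starch sample is then the response matrix that the multivariate models regress on composition and crystalline-polymorph fractions.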
Institute of Scientific and Technical Information of China (English)
Huan-bin Liu; Liu-quan Sun; Li-xing Zhu
2005-01-01
Many survival studies record the times to two or more distinct failures on each subject. The failures may be events of different natures or may be repetitions of the same kind of event. In this article, we consider the regression analysis of such multivariate failure time data under the additive hazards model. Simple weighted estimating functions for the regression parameters are proposed, and the asymptotic distribution theory of the resulting estimators is derived. In addition, a class of generalized Wald and generalized score statistics for hypothesis testing and model selection is presented, and the asymptotic properties of these statistics are examined.
Reduced Multivariate Polynomial Model for Manufacturing Costs Estimation of Piping Elements
Directory of Open Access Journals (Sweden)
Nibaldo Rodriguez
2013-01-01
Full Text Available This paper discusses the development and evaluation of a model for estimating the manufacturing costs of piping elements through the application of a Reduced Multivariate Polynomial (RMP). The model yields accurate estimates even when sufficient and adequate information is not available, a situation that typically occurs in the early stages of the design process of industrial products. The experimental evaluations show that the approach is capable, with low complexity, of reducing uncertainties and predicting costs with significant precision. Comparisons with a neural network also showed that the RMP performs better on a set of classical performance measures, with lower complexity and higher accuracy.
Frank, T D
2002-07-01
Using the method of steps, we describe stochastic processes with delays in terms of Markov diffusion processes. Thus, multivariate Langevin equations and Fokker-Planck equations are derived for stochastic delay differential equations. Natural, periodic, and reflective boundary conditions are discussed. Both Ito and Stratonovich calculus are used. In particular, our Fokker-Planck approach recovers the generalized delay Fokker-Planck equation proposed by Guillouzic et al. The results obtained are applied to a model for population growth: the Gompertz model with delay and multiplicative white noise.
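The delayed Gompertz model with multiplicative white noise can be simulated directly with an Euler-Maruyama scheme and a constant initial history, in the spirit of the method of steps. `gompertz_delay_path` and its parameter values are an illustrative sketch, not the paper's analytical Fokker-Planck treatment.

```python
import numpy as np

def gompertz_delay_path(a, b, tau, sigma, x0, T, dt, seed=0):
    """Euler-Maruyama path of a delayed Gompertz SDE (Ito interpretation):
        dX = X(t) * (a - b * ln X(t - tau)) dt + sigma * X(t) dW,
    with constant history X(t) = x0 for t <= 0."""
    rng = np.random.default_rng(seed)
    n = round(T / dt)
    lag = round(tau / dt)
    x = np.empty(n + 1)
    x[0] = x0
    for i in range(n):
        x_del = x0 if i < lag else x[i - lag]        # delayed state
        drift = x[i] * (a - b * np.log(x_del))
        x[i + 1] = x[i] + drift * dt + sigma * x[i] * np.sqrt(dt) * rng.normal()
        x[i + 1] = max(x[i + 1], 1e-12)              # keep the state positive
    return x

path = gompertz_delay_path(a=1.0, b=1.0, tau=0.5, sigma=0.05,
                           x0=0.5, T=20.0, dt=0.01)
# The deterministic fixed point satisfies ln X* = a/b, i.e. X* = e
assert abs(path[-500:].mean() - np.e) < 0.5
```

Histograms of many such paths approximate the stationary density that the delay Fokker-Planck equation describes analytically.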
MCKissick, Burnell T. (Technical Monitor); Plassman, Gerald E.; Mall, Gerald H.; Quagliano, John R.
2005-01-01
Linear multivariable regression models for predicting day and night Eddy Dissipation Rate (EDR) from available meteorological data sources are defined and validated. Model definition is based on a combination of 1997-2000 Dallas/Fort Worth (DFW) data sources, EDR from Aircraft Vortex Spacing System (AVOSS) deployment data, and regression variables primarily from corresponding Automated Surface Observation System (ASOS) data. Model validation is accomplished through EDR predictions on a similar combination of 1994-1995 Memphis (MEM) AVOSS and ASOS data. Model forms include an intercept plus a single term of fixed optimal power for each of these regression variables: 30-minute forward-averaged mean and variance of near-surface wind speed and temperature, variance of wind direction, and a discrete cloud cover metric. Distinct day and night models, regressing on EDR and the natural log of EDR respectively, yield the best performance and avoid model discontinuity over day/night data boundaries.
Wang, Zhe; Li, Lizhi; Ni, Weidou; Li, Zheng
2010-01-01
This paper presents a new approach applying the partial least squares method combined with a physically based dominant factor. The characteristic line intensity of the specific element was taken to build up the dominant factor reflecting the major elemental concentration, and the partial least squares (PLS) approach was then applied to further improve the model accuracy. The deviation of the characteristic line intensity from the ideal condition was depicted and, according to this understanding, efforts were made to model the non-linear self-absorption and inter-element interference effects to improve the accuracy of the dominant factor model. With a dominant factor to carry the main quantitative information, the novel multivariate model combines advantages of both the conventional univariate and PLS models and partially avoids the overuse of unrelated noise in the spectrum for the PLS application. The dominant factor makes the combination model more robust over a wide concentration range and PLS...
Xu, Cheng-Jian; van der Schaaf, Arjen; Schilstra, Cornelis; Langendijk, Johannes A; van't Veld, Aart A
2012-03-15
To study the impact of different statistical learning methods on the prediction performance of multivariate normal tissue complication probability (NTCP) models. In this study, three learning methods, stepwise selection, least absolute shrinkage and selection operator (LASSO), and Bayesian model averaging (BMA), were used to build NTCP models of xerostomia following radiotherapy treatment for head and neck cancer. Performance of each learning method was evaluated by a repeated cross-validation scheme in order to obtain a fair comparison among methods. It was found that the LASSO and BMA methods produced models with significantly better predictive power than that of the stepwise selection method. Furthermore, the LASSO method yields an easily interpretable model, as the stepwise method does, in contrast to the less intuitive BMA method. The commonly used stepwise selection method, which is simple to execute, may be insufficient for NTCP modeling. The LASSO method is recommended. Copyright © 2012 Elsevier Inc. All rights reserved.
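The sparsity property that favours the LASSO can be illustrated with a small cyclic coordinate-descent implementation on synthetic data; the penalty value, dimensions, and coefficients below are arbitrary assumptions, not tuned as in the study.

```python
import numpy as np

rng = np.random.default_rng(1)
n, p = 100, 20
X = rng.normal(size=(n, p))
beta_true = np.zeros(p)
beta_true[:3] = [1.5, -2.0, 1.0]          # only three truly predictive features
y = X @ beta_true + rng.normal(0.0, 0.5, n)

def lasso_cd(X, y, lam, n_iter=200):
    """LASSO via cyclic coordinate descent with soft-thresholding."""
    n, p = X.shape
    b = np.zeros(p)
    col_ss = (X ** 2).sum(axis=0)
    for _ in range(n_iter):
        for j in range(p):
            r = y - X @ b + X[:, j] * b[j]     # partial residual excluding j
            z = X[:, j] @ r
            b[j] = np.sign(z) * max(abs(z) - lam, 0.0) / col_ss[j]
    return b

b_lasso = lasso_cd(X, y, lam=20.0)
n_selected = int(np.sum(np.abs(b_lasso) > 1e-6))
```

The soft-threshold sets most noise coefficients exactly to zero, which is what makes the resulting model as interpretable as a stepwise fit while retaining shrinkage.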
Energy Technology Data Exchange (ETDEWEB)
Weathers, J.B. [Shock, Noise, and Vibration Group, Northrop Grumman Shipbuilding, P.O. Box 149, Pascagoula, MS 39568 (United States)], E-mail: James.Weathers@ngc.com; Luck, R. [Department of Mechanical Engineering, Mississippi State University, 210 Carpenter Engineering Building, P.O. Box ME, Mississippi State, MS 39762-5925 (United States)], E-mail: Luck@me.msstate.edu; Weathers, J.W. [Structural Analysis Group, Northrop Grumman Shipbuilding, P.O. Box 149, Pascagoula, MS 39568 (United States)], E-mail: Jeffrey.Weathers@ngc.com
2009-11-15
The complexity of mathematical models used by practicing engineers is increasing due to the growing availability of sophisticated mathematical modeling tools and ever-improving computational power. For this reason, the need to define a well-structured process for validating these models against experimental results has become a pressing issue in the engineering community. This validation process is partially characterized by the uncertainties associated with the modeling effort as well as the experimental results. The net impact of the uncertainties on the validation effort is assessed through the 'noise level of the validation procedure', which can be defined as an estimate of the 95% confidence uncertainty bounds for the comparison error between actual experimental results and model-based predictions of the same quantities of interest. Although general descriptions associated with the construction of the noise level using multivariate statistics exist in the literature, a detailed procedure outlining how to account for the systematic and random uncertainties is not available. In this paper, the methodology used to derive the covariance matrix associated with the multivariate normal pdf based on random and systematic uncertainties is examined, and a procedure used to estimate this covariance matrix using Monte Carlo analysis is presented. The covariance matrices are then used to construct approximate 95% confidence constant-probability contours associated with comparison error results for a practical example. In addition, the example is used to show the drawbacks of using a first-order sensitivity analysis when nonlinear local sensitivity coefficients exist. Finally, the example is used to show the connection between the noise level of the validation exercise calculated using multivariate and univariate statistics.
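The Monte Carlo construction of a covariance matrix from random and systematic uncertainty sources can be sketched as follows; the three compared quantities, their standard deviations, and the single shared systematic error are illustrative assumptions. The key point is that a shared systematic error induces off-diagonal covariance, which the Monte Carlo estimate recovers.

```python
import numpy as np

rng = np.random.default_rng(2)
m = 3                                   # number of compared quantities
sig_rand = np.array([0.5, 0.3, 0.4])    # random (uncorrelated) uncertainties
sig_sys = 0.25                          # one systematic error shared by all

# Monte Carlo: each draw adds independent random errors plus a common
# systematic error to the comparison-error vector.
N = 200_000
err = rng.normal(0.0, sig_rand, (N, m)) + rng.normal(0.0, sig_sys, (N, 1))
cov_mc = np.cov(err, rowvar=False)

# Analytic check: diagonal random variances plus sig_sys**2 everywhere.
cov_true = np.diag(sig_rand ** 2) + sig_sys ** 2
```

The estimated covariance can then be used to draw the approximate 95% constant-probability contours the abstract refers to.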
Bevacqua, Emanuele; Maraun, Douglas; Hobæk Haff, Ingrid; Widmann, Martin; Vrac, Mathieu
2017-04-01
Compound events are multivariate extreme events in which the individual contributing variables may not be extreme themselves, but their joint, dependent occurrence causes an extreme impact. Conventional univariate statistical analysis cannot give accurate information regarding the multivariate nature of these events. We develop a conceptual model, implemented via pair-copula constructions, which allows for the quantification of the risk associated with compound events in present-day and future climate, as well as of the uncertainty around such risk. The model includes meteorological predictors which provide insight into both the involved physical processes and the temporal variability of compound events. Moreover, this model provides multivariate statistical downscaling of compound events. Downscaling of compound events is required to extend their risk assessment to the past or future climate, where climate models either do not simulate realistic values of the local variables driving the events, or do not simulate them at all. Based on the developed model, we study compound floods, i.e. joint storm surge and high river runoff, in Ravenna (Italy). To explicitly quantify the risk, we define the impact of compound floods as a function of sea and river levels. We use meteorological predictors to extend the analysis to the past and obtain a more robust risk analysis. We quantify the uncertainties of the risk analysis, finding that they are very large due to the shortness of the available data, though this may also be the case in other studies where they have not been estimated. Ignoring the dependence between sea and river levels would result in an underestimation of risk; in particular, the expected return period of the highest compound flood observed increases from about 20 to 32 years when switching from the dependent to the independent case.
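A toy illustration of why ignoring surge-runoff dependence underestimates compound-flood risk, with a bivariate Gaussian dependence structure standing in for the paper's pair-copula construction; the correlation value is an arbitrary assumption, not the fitted one.

```python
import numpy as np

rng = np.random.default_rng(3)
N = 500_000
rho = 0.6  # assumed surge-runoff dependence (illustrative only)

# Bivariate standard normal stands in for the copula's dependence structure.
z = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=N)

t95 = 1.6449  # 95th percentile of the standard normal
p_dep = np.mean((z[:, 0] > t95) & (z[:, 1] > t95))  # joint exceedance, dependent
p_ind = 0.05 * 0.05                                  # what independence predicts

# Dependence shortens the return period of the compound event considerably.
return_period_dep = 1.0 / p_dep
return_period_ind = 1.0 / p_ind
```

Under independence the joint 95th-percentile exceedance probability would be 0.0025; with positive dependence it is several times larger, so the independent assumption overstates the return period, mirroring the 20-versus-32-year contrast reported above.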
A model-based examination of multivariate physical modes in the Gulf of Alaska
Hermann, A. J.; Ladd, C.; Cheng, W.; Curchitser, E. N.; Hedstrom, K.
2016-10-01
We use multivariate output from a hydrodynamic model of the Gulf of Alaska (GOA) to explore the covariance among its physical state and air/sea fluxes. We attempt to summarize this coupled variability using a limited set of patterns, and examine their correlation with three large-scale climate indices relevant to the Northeast Pacific. This analysis is focused on perturbations from the monthly climatology of the following attributes of the GOA: sea surface temperature, sea surface height, mixed layer depth, sea surface salinity, latent heat flux, sensible heat flux, shortwave irradiance, net longwave irradiance, currents at 40 m depth, and wind stress. We identified two multivariate modes, both substantially correlated with the Pacific Decadal Oscillation (PDO) and the Multivariate ENSO Index (MEI) on interannual timescales, which together account for ~30% of the total normalized variance of the perturbation time series. These two modes indicate the following covarying events during periods of positive PDO/MEI: (1) anomalously warm, wet and windy conditions (typically in winter), with elevated coastal SSH, followed 2-5 months later by (2) reduced cloud cover, with emerging shelf-break eddies. Similar modes are found when the analysis is performed separately on the eastern and western GOA; in general, modal amplitudes appear stronger in the western GOA.
Multivariate soft-modeling to predict radiocesium soil-to-plant transfer.
Rigol, Anna; Camps, Marta; De Juan, Anna; Rauret, Gemma; Vidal, Miquel
2008-06-01
A multivariate soft-modeling approach based on an exploratory principal component analysis (PCA) followed by a partial least squares regression (PLS) was developed, tested, and validated to estimate radiocesium transfer to grass from readily measurable soil characteristics. A data set with 145 soil samples and 21 soil and plant parameters was used. Samples were soils from various field plots contaminated by the Chernobyl accident (soddy-podzolic and peaty soils), submitted to several agricultural treatments (disking, ploughing, fertilization, and liming). Parameters included soil characteristics and the corresponding radiocesium soil-to-plant transfer factors. PCA of the data showed that soil samples were grouped according to the field plots and that they covered a wide range of possible soil-to-plant transfer scenarios. PLS was used to design and build the multivariate prediction model. The soil database was split in two parts: (i) a representative calibration set for training purposes and model building and (ii) a prediction set for external validation and model testing. The regression coefficients of the model confirmed the relevant parameters to describe radiocesium soil-to-plant transfer variation (e.g., phyllosilicate content and NH4+ status), which agreed with previous knowledge on the interaction mechanisms of this radionuclide in soils. The prediction of soil-to-plant transfer was satisfactory, with an error of the same order of magnitude as the variability of field replicates.
On the Bayesian Treed Multivariate Gaussian Process with Linear Model of Coregionalization
Energy Technology Data Exchange (ETDEWEB)
Konomi, Bledar A.; Karagiannis, Georgios; Lin, Guang
2015-02-01
The Bayesian treed Gaussian process (BTGP) has gained popularity in recent years because it provides a straightforward mechanism for modeling non-stationary data and can alleviate computational demands by fitting models to less data. The extension of BTGP to the multivariate setting requires us to model the cross-covariance and to propose efficient algorithms that can deal with trans-dimensional MCMC moves. In this paper we extend the cross-covariance of the Bayesian treed multivariate Gaussian process (BTMGP) to linear model of coregionalization (LMC) cross-covariances. Different strategies have been developed to improve the MCMC mixing and invert smaller matrices in the Bayesian inference. Moreover, we compare the proposed BTMGP with existing multiple BTGP and BTMGP models in test cases and in a multiphase flow computer experiment in a full-scale regenerator of a carbon capture unit. The BTMGP with LMC cross-covariance predicted the computer experiments better than the existing competitors. The proposed model has a wide variety of applications, such as computer experiments and environmental data. In the case of computer experiments we also develop an adaptive sampling strategy for the BTMGP with LMC cross-covariance function.
Loukas, Constantinos; Georgiou, Evangelos
2013-01-01
There is currently great interest in analyzing the workflow of minimally invasive operations performed in a physical or simulation setting, with the aim of extracting important information that can be used for skills improvement, optimization of intraoperative processes, and comparison of different interventional strategies. The first step in achieving this goal is to segment the operation into its key interventional phases, which is currently approached by modeling a multivariate signal that describes the temporal usage of a predefined set of tools. Although this technique has shown promising results, it is challenged by the manual extraction of the tool usage sequence and the inability to simultaneously evaluate the surgeon's skills. In this paper we describe an alternative methodology for surgical phase segmentation and performance analysis based on Gaussian mixture multivariate autoregressive (GMMAR) models of the hand kinematics. Unlike previous work in this area, our technique employs signals from orientation sensors, attached to the endoscopic instruments of a virtual reality simulator, without considering which tools are employed at each time-step of the operation. First, based on pre-segmented hand motion signals, a training set of regression coefficients is created for each surgical phase using multivariate autoregressive (MAR) models. Then, a signal from a new operation is processed with GMMAR, wherein each phase is modeled by a Gaussian component of regression coefficients. These coefficients are compared to those of the training set. The operation is segmented according to the prior probabilities of the surgical phases estimated via GMMAR. The method also allows for the study of motor behavior and hand motion synchronization demonstrated in each phase, a quality that can be incorporated into modern laparoscopic simulators for skills assessment.
Zhang, Yong; Zhong, Miner; Geng, Nana; Jiang, Yunjian
2017-01-01
The market demand for electric vehicles (EVs) has increased in recent years. Suitable models are necessary to understand and forecast EV sales. This study presents singular spectrum analysis (SSA) as a univariate time-series model and the vector autoregressive model (VAR) as a multivariate model. Empirical results suggest that SSA satisfactorily indicates the evolving trend and provides reasonable results. The VAR model, which comprises exogenous market-related parameters on a monthly basis, can significantly improve the prediction accuracy. The EV sales in China, which are categorized into battery and plug-in EVs, are predicted in both the short term (up to December 2017) and the long term (up to 2020), providing statistical evidence of the growth of the Chinese EV industry.
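The SSA step (embed, decompose, reconstruct) can be sketched on a synthetic monthly-sales-like series; the window length and the choice of the two leading components for the trend are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(4)
n, L = 120, 40            # series length and SSA window length
t = np.arange(n)
trend = 2.0 + 0.05 * t    # assumed underlying sales trend
series = trend + 0.5 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 0.2, n)

# 1) Embedding: build the trajectory (Hankel) matrix.
K = n - L + 1
X = np.column_stack([series[i:i + L] for i in range(K)])

# 2) Decomposition: SVD of the trajectory matrix.
U, s, Vt = np.linalg.svd(X, full_matrices=False)

# 3) Reconstruction: keep the two leading components (a linear trend has
#    rank 2) and average over anti-diagonals to get a series back.
X2 = (U[:, :2] * s[:2]) @ Vt[:2]
recon = np.zeros(n)
counts = np.zeros(n)
for i in range(L):
    for j in range(K):
        recon[i + j] += X2[i, j]
        counts[i + j] += 1
recon /= counts
```

Grouping further components would separate the seasonal cycle from noise; here only the trend extraction is shown.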
Energy consumption and economic growth in New Zealand. Results of trivariate and multivariate models
Energy Technology Data Exchange (ETDEWEB)
Bartleet, Matthew; Gounder, Rukmani [Department of Economics and Finance, Massey University, Palmerston North (New Zealand)
2010-07-15
This study examines the energy consumption-growth nexus in New Zealand. Causal linkages between energy and macroeconomic variables are investigated using trivariate demand-side and multivariate production models. Long-run and short-run relationships are estimated for the period 1960-2004. The estimates of the demand model reveal a long-run relationship between energy consumption, real GDP and energy prices. The short-run results indicate that real GDP Granger-causes energy consumption without feedback, consistent with the proposition that energy demand is a derived demand. Energy prices are found to be significant for energy consumption outcomes. Production model results indicate a long-run relationship between real GDP, energy consumption and employment. Granger causality is found from real GDP to energy consumption, providing additional evidence to support the neoclassical proposition that energy consumption in New Zealand is fundamentally driven by economic activities. Inclusion of capital in the multivariate production model shows short-run causality from capital to energy consumption. Also, changes in real GDP and employment have significant predictive power for changes in real capital. (author)
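The Granger-causality logic used above can be sketched with a pair of restricted/unrestricted OLS regressions and an F-statistic. The synthetic series below build in one-way causation from GDP growth to energy-consumption growth, an assumption made for illustration that mirrors the paper's finding.

```python
import numpy as np

rng = np.random.default_rng(5)
n, p = 200, 2

# Synthetic growth series with built-in one-way causation: GDP -> energy.
gdp = rng.normal(0.0, 1.0, n)
energy = np.zeros(n)
for t in range(1, n):
    energy[t] = 0.3 * energy[t - 1] + 0.5 * gdp[t - 1] + rng.normal(0.0, 0.5)

def granger_f(y, x, p):
    """F-statistic for H0: the p lags of x add nothing to an AR(p) of y."""
    Y = y[p:]
    lags_y = [y[p - i:len(y) - i] for i in range(1, p + 1)]
    lags_x = [x[p - i:len(x) - i] for i in range(1, p + 1)]
    Z_r = np.column_stack([np.ones(len(Y))] + lags_y)           # restricted
    Z_u = np.column_stack([np.ones(len(Y))] + lags_y + lags_x)  # unrestricted
    rss = lambda Z: np.sum((Y - Z @ np.linalg.lstsq(Z, Y, rcond=None)[0]) ** 2)
    rss_r, rss_u = rss(Z_r), rss(Z_u)
    df2 = len(Y) - Z_u.shape[1]
    return ((rss_r - rss_u) / p) / (rss_u / df2)

f_gdp_to_energy = granger_f(energy, gdp, p)
f_energy_to_gdp = granger_f(gdp, energy, p)
```

A large F in one direction and a small F in the other is the "causality without feedback" pattern the abstract describes; in practice one would compare against the F distribution's critical value.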
Multivariate process modeling of high-volume manufacturing of consumer electronics
Asp, Stefan; Wide, Peter
1998-12-01
As production volumes continue to increase and competition in the global market for consumer electronics gets fiercer, a reliable and essentially fault-free production process is becoming a necessity for survival. The manufacturing processes of today are highly complex, and the increasing amount of process data produced makes it hard to extract the useful information from such a huge data set. We have used multivariate and nonlinear process modeling to examine the surface mount production process in high-volume manufacturing of mobile telephones and built an artificial neural network model of the process. As input parameters to the model we used process data logged by automatic test equipment, and the result variables come from an automatic inspection system placed after the board manufacturing process. Using multivariate process modeling has enabled us to identify parameters that contribute heavily to the quality of the product and can further be used to optimize the manufacturing process with respect to system production faults.
Li, Baoyue; Bruyneel, Luk; Lesaffre, Emmanuel
2014-05-20
A traditional Gaussian hierarchical model assumes a nested multilevel structure for the mean and a constant variance at each level. We propose a Bayesian multivariate multilevel factor model that assumes a multilevel structure for both the mean and the covariance matrix. That is, in addition to a multilevel structure for the mean, we also assume that the covariance matrix depends on covariates and random effects. This allows us to explore whether the covariance structure depends on the values of the higher levels, and as such it models heterogeneity in the variances and correlation structure of the multivariate outcome across the higher-level values. The approach is applied to the three-dimensional vector of burnout measurements collected on nurses in a large European study to answer the research question of whether the covariance matrix of the outcomes depends on recorded system-level features in the organization of nursing care, but also on unrecorded factors that vary with countries, hospitals, and nursing units. Simulations illustrate the performance of our modeling approach. Copyright © 2013 John Wiley & Sons, Ltd.
Yue, Chen; Chen, Shaojie; Sair, Haris I; Airan, Raag; Caffo, Brian S
2015-09-01
Data reproducibility is a critical issue in all scientific experiments. In this manuscript, the problem of quantifying the reproducibility of graphical measurements is considered. The image intra-class correlation coefficient (I2C2) is generalized and the graphical intra-class correlation coefficient (GICC) is proposed for this purpose. The concept for GICC is based on multivariate probit-linear mixed effect models. A Markov chain Monte Carlo EM (MCMC-EM) algorithm is used for estimating the GICC. Simulation results with varied settings are demonstrated and our method is applied to the KIRBY21 test-retest dataset.
DEFF Research Database (Denmark)
Janssen, Anja; Mikosch, Thomas Valentin; Rezapour, Mohsen
2017-01-01
We consider a multivariate heavy-tailed stochastic volatility model and analyze the large-sample behavior of its sample covariance matrix. We study the limiting behavior of its entries in the infinite-variance case and derive results for the ordered eigenvalues and corresponding eigenvectors...... of the sample covariance matrix. While we show that in the case of heavy-tailed innovations the limiting behavior resembles that of completely independent observations, we also derive that in the case of a heavy-tailed volatility sequence the possible limiting behavior is more diverse, i.e. allowing...
Achtemeier, Gary L.; Ochs, Harry T., III
1988-01-01
The variational method of undetermined multipliers is used to derive a multivariate model for objective analysis. The model is intended for the assimilation of 3-D fields of rawinsonde height, temperature and wind, and mean level temperature observed by satellite into a dynamically consistent data set. Relative measurement errors are taken into account. The dynamic equations are the two nonlinear horizontal momentum equations, the hydrostatic equation, and an integrated continuity equation. The model Euler-Lagrange equations are eleven linear and/or nonlinear partial differential and/or algebraic equations. A cyclical solution sequence is described. Other model features include a nonlinear terrain-following vertical coordinate that eliminates truncation error in the pressure gradient terms of the horizontal momentum equations and easily accommodates satellite observed mean layer temperatures in the middle and upper troposphere. A projection of the pressure gradient onto equivalent pressure surfaces removes most of the adverse impacts of the lower coordinate surface on the variational adjustment.
Educational Technology Funding Models
Mark, Amy E.
2008-01-01
Library and cross-disciplinary literature all stress the increasing importance of instructional technology in higher education. However, there is a dearth of articles detailing funding for library instructional technology. The bulk of library literature on funding for these projects focuses on one-time grant opportunities and on the architecture…
Evaluating Fit Indices for Multivariate t-Based Structural Equation Modeling with Data Contamination
Directory of Open Access Journals (Sweden)
Mark H. C. Lai
2017-07-01
In conventional structural equation modeling (SEM), with the presence of even a tiny amount of data contamination due to outliers or influential observations, normal-theory maximum likelihood (ML-Normal) is not efficient and can be severely biased. The multivariate-t-based SEM, which was recently implemented in Mplus as an approach for mixture modeling, represents a robust estimation alternative that downweighs the impact of outliers and influential observations. To our knowledge, the use of maximum likelihood estimation with a multivariate-t model (ML-t) to handle outliers has not been shown in the SEM literature. In this paper we demonstrate the use of ML-t using the classic Holzinger and Swineford (1939) data set with a few observations modified to be outliers or influential observations. A simulation study is then conducted to examine the performance of fit indices and information criteria under ML-Normal and ML-t in the presence of outliers. Results showed that whereas all fit indices worsened for ML-Normal with an increasing amount of outliers and influential observations, their values were relatively stable with ML-t, and the use of information criteria was effective in selecting ML-Normal without data contamination and selecting ML-t with data contamination, especially when the sample size was at least 200.
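The robustness argument for ML-t can be seen already in a one-dimensional location problem: an EM iteration for a Student-t likelihood downweights outliers through its latent weights, whereas the normal MLE (the sample mean) is dragged toward them. A minimal sketch with an assumed degrees-of-freedom value and synthetic contamination, not the SEM setting itself:

```python
import numpy as np

rng = np.random.default_rng(6)
x = rng.normal(5.0, 1.0, 100)
x[:5] = 50.0  # gross outliers contaminating the sample

def t_location(x, df=3.0, n_iter=100):
    """EM for the location/scale of a Student-t with fixed df."""
    mu, s2 = np.median(x), np.var(x)
    for _ in range(n_iter):
        # E-step: latent precision weights shrink toward 0 for outliers.
        w = (df + 1.0) / (df + (x - mu) ** 2 / s2)
        # M-step: weighted location and scale updates.
        mu = np.sum(w * x) / np.sum(w)
        s2 = np.sum(w * (x - mu) ** 2) / len(x)
    return mu

mu_t = t_location(x)       # robust, t-likelihood estimate
mu_normal = x.mean()       # ML under normality: dragged by outliers
```

The same weighting mechanism, applied to case-level Mahalanobis distances, is what stabilizes the multivariate-t fit indices reported above.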
Chen, Hsiang-Chun; Wehrly, Thomas E
2015-02-20
The classic concordance correlation coefficient measures the agreement between two variables. In recent studies, concordance correlation coefficients have been generalized to deal with responses from a distribution from the exponential family using the univariate generalized linear mixed model. Multivariate data arise when responses on the same unit are measured repeatedly by several methods. The relationship among these responses is often of interest. In clustered mixed data, the correlation could be present between repeated measurements either within the same observer or between different methods on the same subjects. Indices for measuring such association are needed. This study proposes a series of indices, namely, intra-correlation, inter-correlation, and total correlation coefficients to measure the correlation under various circumstances in a multivariate generalized linear model, especially for joint modeling of clustered count and continuous outcomes. The proposed indices are natural extensions of the concordance correlation coefficient. We demonstrate the methodology with simulation studies. A case example of osteoarthritis study is provided to illustrate the use of these proposed indices. Copyright © 2014 John Wiley & Sons, Ltd.
Berry, Brandon; Moretto, Justin; Matthews, Thomas; Smelko, John; Wiltberger, Kelly
2015-01-01
Multi-component, multi-scale Raman spectroscopy modeling results from a monoclonal antibody producing CHO cell culture process including data from two development scales (3 L, 200 L) and a clinical manufacturing scale environment (2,000 L) are presented. Multivariate analysis principles are a critical component to partial least squares (PLS) modeling but can quickly turn into an overly iterative process, thus a simplified protocol is proposed for addressing necessary steps including spectral preprocessing, spectral region selection, and outlier removal to create models exclusively from cell culture process data without the inclusion of spectral data from chemically defined nutrient solutions or targeted component spiking studies. An array of single-scale and combination-scale modeling iterations were generated to evaluate technology capabilities and model scalability. Analysis of prediction errors across models suggests that glucose, lactate, and osmolality are well modeled. Model strength was confirmed via predictive validation and by examining performance similarity across single-scale and combination-scale models. Additionally, accurate predictive models were attained in most cases for viable cell density and total cell density; however, these components exhibited some scale-dependencies that hindered model quality in cross-scale predictions where only development data was used in calibration. Glutamate and ammonium models were also able to achieve accurate predictions in most cases. However, there are differences in the absolute concentration ranges of these components across the datasets of individual bioreactor scales. Thus, glutamate and ammonium PLS models were forced to extrapolate in cases where models were derived from small scale data only but used in cross-scale applications predicting against manufacturing scale batches. © 2014 American Institute of Chemical Engineers.
Valle, Denis; Baiser, Benjamin; Woodall, Christopher W; Chazdon, Robin
2014-12-01
We propose a novel multivariate method to analyse biodiversity data based on the Latent Dirichlet Allocation (LDA) model. LDA, a probabilistic model, reduces assemblages to sets of distinct component communities. It produces easily interpretable results, can represent abrupt and gradual changes in composition, accommodates missing data and allows for coherent estimates of uncertainty. We illustrate our method using tree data for the eastern United States and from a tropical successional chronosequence. The model is able to detect pervasive declines in the oak community in Minnesota and Indiana, potentially due to fire suppression, increased growing season precipitation and herbivory. The chronosequence analysis is able to delineate clear successional trends in species composition, while also revealing that site-specific factors significantly impact these successional trajectories. The proposed method provides a means to decompose and track the dynamics of species assemblages along temporal and spatial gradients, including effects of global change and forest disturbances.
A multivariate model for the meta-analysis of study level survival data at multiple times.
Jackson, Dan; Rollins, Katie; Coughlin, Patrick
2014-09-01
Motivated by our meta-analytic dataset involving survival rates after treatment for critical leg ischemia, we develop and apply a new multivariate model for the meta-analysis of study-level survival data at multiple times. Our data set involves 50 studies that provide mortality rates at up to seven time points, which we model simultaneously, and we compare the results to those obtained from standard methodologies. Our method uses exact binomial within-study distributions and enforces the constraints that both the study-specific and the overall mortality rates must not decrease over time. We directly model the probabilities of mortality at each time point, which are the quantities of primary clinical interest. We also present I^2 statistics that quantify the impact of the between-study heterogeneity, which is very considerable in our data set.
Evtushenko, V. F.; Myshlyaev, L. P.; Makarov, G. V.; Ivushkin, K. A.; Burkova, E. V.
2016-10-01
The structure of multi-variant physical and mathematical models of a control system is proposed, as well as its application to the adjustment of automatic control systems (ACS) of production facilities, using a coal processing plant as an example.
DEFF Research Database (Denmark)
Jensen, Kasper Lynge; Spliid, Henrik; Toftum, Jørn
2011-01-01
The aim of the current study was to apply multivariate mixed-effects modeling to analyze experimental data on the relation between air quality and the performance of office work. The method estimates in one step the effect of the exposure on a multi-dimensional response variable, and yields important information on the correlation between the different dimensions of the response variable, which in this study was composed of both subjective perceptions and a two-dimensional performance task outcome. Such correlation is typically not included in the output from univariate analysis methods. The analysis seems superior to conventional univariate statistics, and the information provided may be important for the design of performance experiments in general and for the conclusions that can be based on such studies.
Multivariate modelling with 1H NMR of pleural effusion in murine cerebral malaria
Directory of Open Access Journals (Sweden)
Ghosh Soumita
2011-11-01
Background: Cerebral malaria is a clinical manifestation of Plasmodium falciparum infection. Although brain damage is the predominant pathophysiological complication of cerebral malaria (CM), respiratory distress, acute lung injury, and hydrothorax/pleural effusion are also observed in several cases. Immunological parameters have been assessed in pleural fluid in murine models; however, there are no reports characterizing the metabolites present in pleural effusion. Methods: 1H NMR spectra of the sera and the pleural effusion of cerebral malaria infected mice were analyzed using principal component analysis, orthogonal partial least squares analysis, multiway principal component analysis, and multivariate curve resolution. Results: It was observed that there was 100% occurrence of pleural effusion (PE) in the mice affected with CM, as opposed to those that were non-cerebral and succumbed to hyperparasitaemia (NCM/HP). An analysis of the 1H NMR and SDS-PAGE profiles of PE and serum samples of each of the CM mice exhibited a similar profile in terms of constituents. Multivariate analysis of these two classes of biofluids was performed and significant differences were detected in the concentrations of metabolites. Glucose, creatine and glutamine contents were high in the PE, and lipids were high in the sera. Multivariate curve resolution between sera and pleural effusion showed that changes in PE co-varied with those of serum in CM mice. The increase of glucose in PE is negatively correlated with the glucose in serum in CM, as obtained from multiway principal component analysis. Conclusions: This study reports, for the first time, the characterization of metabolites in pleural effusion formed during murine cerebral malaria. The study indicates that the origin of PE metabolites in murine CM may be the serum. The loss of components like glucose, glutamine and creatine into the PE may worsen the situation of patients, in conjunction with the enhanced
Nieto, Paulino José García; Antón, Juan Carlos Álvarez; Vilán, José Antonio Vilán; García-Gonzalo, Esperanza
2014-10-01
The aim of this research work is to build a regression model of particulate matter up to 10 micrometers in size (PM10) in the Oviedo urban area (Northern Spain) at local scale using the multivariate adaptive regression splines (MARS) technique. This work explores the use of MARS, a nonparametric regression algorithm with the ability to approximate the relationship between inputs and outputs and express that relationship mathematically. In this context, hazardous air pollutants or toxic air contaminants refer to any substance that may cause or contribute to an increase in mortality or serious illness, or that may pose a present or potential hazard to human health. To accomplish the objective of this study, experimental data on nitrogen oxides (NOx), carbon monoxide (CO), sulfur dioxide (SO2), ozone (O3) and dust (PM10) were collected over 3 years (2006-2008) and used to create a highly nonlinear MARS model of PM10 in the Oviedo urban nucleus. One main objective of this model is to obtain a preliminary estimate of the dependence of PM10 on the other pollutants in the Oviedo urban area at local scale. A second aim is to determine the factors with the greatest bearing on air quality with a view to proposing health and lifestyle improvements. The United States National Ambient Air Quality Standards (NAAQS) establish limit values for the main atmospheric pollutants in order to protect human health. Firstly, this MARS regression model captures the main insight of statistical learning theory in order to obtain a good prediction of the dependence among the main pollutants in the Oviedo urban area. Secondly, the main advantages of MARS are its capacity to produce simple, easy-to-interpret models, its ability to estimate the contributions of the input variables, and its computational efficiency. Finally, on the basis of
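The core of MARS is regression on hinge functions max(0, x-t) and max(0, t-x). The sketch below is a deliberately simplified illustration: it uses a fixed grid of candidate knots and a single predictor, whereas real MARS searches knots and interactions adaptively with a forward/backward pass; the data are synthetic, not the Oviedo measurements.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical data: a piecewise-linear response (like PM10 vs. one pollutant)
x = rng.uniform(0.0, 10.0, 300)
y_true = np.where(x < 4.0, 2.0 * x, 8.0 - 1.5 * (x - 4.0))
y = y_true + rng.normal(0.0, 0.3, x.size)

def hinge_basis(x, knots):
    """Pairs of hinge functions max(0, x-t), max(0, t-x) for each knot t."""
    cols = [np.ones_like(x)]
    for t in knots:
        cols.append(np.maximum(0.0, x - t))
        cols.append(np.maximum(0.0, t - x))
    return np.column_stack(cols)

# Fixed candidate knots; real MARS chooses knots greedily instead
knots = np.linspace(1.0, 9.0, 9)
B = hinge_basis(x, knots)
coef, *_ = np.linalg.lstsq(B, y, rcond=None)
y_hat = B @ coef

rmse = np.sqrt(np.mean((y_hat - y_true) ** 2))
print(round(rmse, 3))
```

Because the basis contains a hinge at the true breakpoint, the least-squares fit recovers the kink that a global linear model would smooth over, which is exactly the property that makes MARS attractive for nonlinear pollutant relationships.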
Sang, Huiyan
2011-12-01
This paper investigates the cross-correlations across multiple climate model errors. We build a Bayesian hierarchical model that accounts for the spatial dependence of individual models as well as cross-covariances across different climate models. Our method allows for a nonseparable and nonstationary cross-covariance structure. We also present a covariance approximation approach to facilitate the computation in the modeling and analysis of very large multivariate spatial data sets. The covariance approximation consists of two parts: a reduced-rank part to capture the large-scale spatial dependence, and a sparse covariance matrix to correct the small-scale dependence error induced by the reduced rank approximation. We pay special attention to the case that the second part of the approximation has a block-diagonal structure. Simulation results of model fitting and prediction show substantial improvement of the proposed approximation over the predictive process approximation and the independent blocks analysis. We then apply our computational approach to the joint statistical modeling of multiple climate model errors. © 2012 Institute of Mathematical Statistics.
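The two-part covariance approximation described above (a reduced-rank term for large-scale dependence plus a sparse correction for small-scale error) can be illustrated on a toy problem. The setup is hypothetical: a single variable on a 1-D grid with an exponential covariance, a predictive-process-style low-rank term built from knots, and a block-diagonal correction, mirroring the block-diagonal case the paper highlights.

```python
import numpy as np

# Dense exponential covariance on a 1-D grid stands in for a large
# spatial covariance matrix (hypothetical single-variable illustration)
s = np.linspace(0.0, 1.0, 200)
C = np.exp(-np.abs(s[:, None] - s[None, :]) / 0.3)

# Part 1: reduced-rank (predictive-process style) term built from m knots
knots = np.linspace(0.0, 1.0, 15)
C_sk = np.exp(-np.abs(s[:, None] - knots[None, :]) / 0.3)
C_kk = np.exp(-np.abs(knots[:, None] - knots[None, :]) / 0.3)
low_rank = C_sk @ np.linalg.solve(C_kk, C_sk.T)

# Part 2: sparse (here block-diagonal) correction of the residual, which
# repairs the small-scale dependence the low-rank part misses
resid = C - low_rank
correction = np.zeros_like(C)
block = 25
for i in range(0, len(s), block):
    correction[i:i + block, i:i + block] = resid[i:i + block, i:i + block]

err_low = np.linalg.norm(C - low_rank)
err_full = np.linalg.norm(C - (low_rank + correction))
print(err_full < err_low)
```

The correction exactly matches the residual inside each block, so the combined approximation is never worse than the reduced-rank term alone; the paper's contribution is doing this with nonseparable, nonstationary cross-covariances across multiple climate models.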
DEFF Research Database (Denmark)
Wu, Yili; Zhang, Dongfeng; Pang, Zengchang;
2015-01-01
Systolic and diastolic blood pressure, pulse pressure (PP), and body mass index (BMI) are heritable traits in human metabolic health but their common genetic and environmental backgrounds are not well investigated. The aim of this article was to explore the phenotypic and genetic associations among...... PP, systolic blood pressure (SBP), diastolic blood pressure (DBP), and BMI. The studied sample contained 615 twin pairs (17-84 years) collected in the Qingdao municipality. Univariate and multivariate structural equation models were fitted for assessing the genetic and environmental contributions...... model estimated (1) high genetic correlations for DBP with SBP (0.87), PP with SBP (0.75); (2) low-moderate genetic correlations between PP and DBP (0.32), each BP component and BMI (0.24-0.37); (3) moderate unique environmental correlation for PP with SBP (0.68) and SBP with DBP (0.63); (4...
A frailty model approach for regression analysis of multivariate current status data.
Chen, Man-Hua; Tong, Xingwei; Sun, Jianguo
2009-11-30
This paper discusses regression analysis of multivariate current status failure time data (The Statistical Analysis of Interval-censoring Failure Time Data. Springer: New York, 2006), which occur quite often in, for example, tumorigenicity experiments and epidemiologic investigations of the natural history of a disease. For the problem, several marginal approaches have been proposed that model each failure time of interest individually (Biometrics 2000; 56:940-943; Statist. Med. 2002; 21:3715-3726). In this paper, we present a full likelihood approach based on the proportional hazards frailty model. For estimation, an Expectation Maximization (EM) algorithm is developed and simulation studies suggest that the presented approach performs well for practical situations. The approach is applied to a set of bivariate current status data arising from a tumorigenicity experiment.
Directory of Open Access Journals (Sweden)
Marco Flôres Ferrão
2010-11-01
Full Text Available In the present work, multivariate regression models using interval partial least squares (iPLS) and backward interval partial least squares (biPLS) were developed and compared. The iPLS and biPLS models were built to determine the concentration of biodiesel in biodiesel/diesel blends using infrared spectroscopy signals. Forty-five samples with concentrations in the range 8-30% of biodiesel and two distinct spectrophotometers were used. Both techniques (iPLS and biPLS), using data obtained by HATR-FTIR, showed promise for the development of simpler, faster and non-destructive methodologies for biodiesel determination in commercial blends.
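The iPLS idea is to fit a local PLS model on each spectral interval and keep the most predictive one. The sketch below uses a minimal NIPALS PLS1 and synthetic "spectra" in which only one interval carries the analyte signal; a real iPLS/biPLS workflow would cross-validate and (for biPLS) combine intervals, and the data here are invented, not the HATR-FTIR measurements.

```python
import numpy as np

rng = np.random.default_rng(2)

def pls1_fit(X, y, ncomp):
    """Minimal NIPALS PLS1; returns the regression vector b, y_hat = X @ b."""
    Xk, yk = X.copy(), y.astype(float).copy()
    W, P, Q = [], [], []
    for _ in range(ncomp):
        w = Xk.T @ yk
        w /= np.linalg.norm(w)
        t = Xk @ w
        tt = t @ t
        p = Xk.T @ t / tt
        q = (yk @ t) / tt
        Xk = Xk - np.outer(t, p)       # deflate X
        yk = yk - q * t                # deflate y
        W.append(w); P.append(p); Q.append(q)
    W, P, Q = np.array(W).T, np.array(P).T, np.array(Q)
    return W @ np.linalg.solve(P.T @ W, Q)

# Synthetic "spectra": 60 samples x 120 variables; only variables 40-60
# carry the analyte signal (an informative spectral interval)
n, p = 60, 120
conc = rng.uniform(8.0, 30.0, n)          # e.g. % biodiesel
X = rng.normal(0.0, 1.0, (n, p))
X[:, 40:60] += np.outer(conc, rng.uniform(0.5, 1.0, 20)) / 5.0

# iPLS scan: fit a local PLS model per interval, keep the lowest RMSE
intervals = [(i, i + 20) for i in range(0, p, 20)]
best = None
for lo, hi in intervals:
    Xi = X[:, lo:hi] - X[:, lo:hi].mean(axis=0)
    yi = conc - conc.mean()
    b = pls1_fit(Xi, yi, ncomp=2)
    rmse = np.sqrt(np.mean((Xi @ b - yi) ** 2))
    if best is None or rmse < best[0]:
        best = (rmse, (lo, hi))
print(best[1])
```

The scan singles out the interval containing the signal, which is how iPLS localizes the informative spectral region before building the final calibration model.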
Pleiotropy analysis of quantitative traits at gene level by multivariate functional linear models.
Wang, Yifan; Liu, Aiyi; Mills, James L; Boehnke, Michael; Wilson, Alexander F; Bailey-Wilson, Joan E; Xiong, Momiao; Wu, Colin O; Fan, Ruzong
2015-05-01
In genetics, pleiotropy describes the genetic effect of a single gene on multiple phenotypic traits. A common approach is to analyze the phenotypic traits separately using univariate analyses and combine the test results through multiple comparisons. This approach may lead to low power. Multivariate functional linear models are developed to connect genetic variant data to multiple quantitative traits adjusting for covariates for a unified analysis. Three types of approximate F-distribution tests based on Pillai-Bartlett trace, Hotelling-Lawley trace, and Wilks's Lambda are introduced to test for association between multiple quantitative traits and multiple genetic variants in one genetic region. The approximate F-distribution tests provide much more significant results than those of F-tests of univariate analysis and optimal sequence kernel association test (SKAT-O). Extensive simulations were performed to evaluate the false positive rates and power performance of the proposed models and tests. We show that the approximate F-distribution tests control the type I error rates very well. Overall, simultaneous analysis of multiple traits can increase power performance compared to an individual test of each trait. The proposed methods were applied to analyze (1) four lipid traits in eight European cohorts, and (2) three biochemical traits in the Trinity Students Study. The approximate F-distribution tests provide much more significant results than those of F-tests of univariate analysis and SKAT-O for the three biochemical traits. The approximate F-distribution tests of the proposed functional linear models are more sensitive than those of the traditional multivariate linear models that in turn are more sensitive than SKAT-O in the univariate case. The analysis of the four lipid traits and the three biochemical traits detects more association than SKAT-O in the univariate case.
Directory of Open Access Journals (Sweden)
Zhifeng Zhong
2017-01-01
Full Text Available Owing to environment, temperature, and other factors, photovoltaic power generation volume fluctuates constantly and consequently has a serious impact on power grid planning and operation. Therefore, it is of great importance to predict the power generation of a photovoltaic (PV) system accurately in advance. In order to improve the prediction accuracy, in this paper a novel particle-swarm-optimization-based multivariable grey theory model is proposed for short-term photovoltaic power generation volume forecasting. By integrating the particle swarm optimization algorithm, the prediction accuracy of the grey theory model is expected to be greatly improved. In addition, large amounts of real data from two separate power stations in China are employed for model verification. The experimental results indicate that, compared with the conventional grey model, the mean relative error of the proposed model is reduced from 7.14% to 3.53%. This practice demonstrates that the proposed optimization model outperforms the conventional grey model from both theoretical and practical perspectives.
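The general pattern of coupling PSO with a grey model can be sketched as follows. The abstract does not specify which grey-model parameters are optimized, so this sketch makes an assumption: a univariate GM(1,1) whose background coefficient (classically fixed at 0.5) is tuned by a tiny PSO; the PV series is synthetic. One particle starts at 0.5, so the tuned model can only match or improve on the conventional one.

```python
import numpy as np

rng = np.random.default_rng(7)

def gm11_fit_predict(x, p):
    """GM(1,1) grey model with background coefficient p (classically 0.5)."""
    n = x.size
    x1 = np.cumsum(x)                            # accumulated series (AGO)
    z = p * x1[1:] + (1.0 - p) * x1[:-1]         # background values
    B = np.column_stack([-z, np.ones(n - 1)])
    a, b = np.linalg.lstsq(B, x[1:], rcond=None)[0]
    k = np.arange(n)
    x1_hat = (x[0] - b / a) * np.exp(-a * k) + b / a
    return np.concatenate([[x[0]], np.diff(x1_hat)])

# Hypothetical daily PV generation series (exponential-ish trend + noise)
x = 50.0 * 1.03 ** np.arange(14) + rng.normal(0.0, 1.5, 14)

def mre(p):
    """Mean relative error of the GM(1,1) fit for background coefficient p."""
    return float(np.mean(np.abs(gm11_fit_predict(x, p) - x) / x))

# Tiny PSO over the single parameter p in (0, 1)
pos = np.array([0.5, 0.2, 0.8, 0.35, 0.65])      # one particle at classic 0.5
vel = np.zeros_like(pos)
pbest, pbest_f = pos.copy(), np.array([mre(p) for p in pos])
for _ in range(40):
    g = pbest[np.argmin(pbest_f)]                # global best
    r1, r2 = rng.random(pos.size), rng.random(pos.size)
    vel = 0.6 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (g - pos)
    pos = np.clip(pos + vel, 0.01, 0.99)
    f = np.array([mre(p) for p in pos])
    improved = f < pbest_f
    pbest[improved], pbest_f[improved] = pos[improved], f[improved]

p_opt = pbest[np.argmin(pbest_f)]
print(round(mre(0.5), 4), round(mre(p_opt), 4))
```

The paper's multivariable grey model and its PSO parametrization are more elaborate; this only shows the optimization loop wrapped around a grey-model fitness function.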
Multi-variate spatial explicit constraining of a large scale hydrological model
Rakovec, Oldrich; Kumar, Rohini; Samaniego, Luis
2016-04-01
Increased availability and quality of near real-time data should lead to a better understanding of the predictive skill of distributed hydrological models. Nevertheless, prediction of regional-scale water fluxes and states remains a great challenge to the scientific community. Large-scale hydrological models are used for prediction of soil moisture, evapotranspiration and other related water states and fluxes. They are usually constrained against river discharge, which is an integral variable. Rakovec et al. (2016) recently demonstrated that constraining model parameters against river discharge is a necessary, but not a sufficient, condition. Therefore, we further aim at scrutinizing the appropriate incorporation of readily available information into a hydrological model that may help to improve the realism of simulated hydrological processes. It is important to analyze how complementary datasets, besides observed streamflow and related signature measures, can improve the skill of internal model variables during parameter estimation. Among the products suitable for further scrutiny are, for example, the GRACE satellite observations. Recent developments in using this dataset in a multivariate fashion to complement traditionally used streamflow data within the distributed model mHM (www.ufz.de/mhm) are presented. The study domain consists of 80 European basins, which cover a wide range of distinct physiographic and hydrologic regimes. A first-order data quality check ensures that heavily human-influenced basins are eliminated. For river discharge simulations we show that model performance remains unchanged when complemented by information from the GRACE product (at both daily and monthly time steps). Moreover, the GRACE complementary data lead to consistent and statistically significant improvements in evapotranspiration estimates, which are evaluated using an independent gridded FLUXNET product. We also show that the choice of the objective function used to estimate
A general multivariate qualitative model for sizing stand-alone photovoltaic systems
Energy Technology Data Exchange (ETDEWEB)
Sidrach-de-Cardona, M. [Dpto. Fisica Aplicada II, E.T.S.I. Informatica, Universidad de Malaga, 29071 Malaga (Spain); Mora Lopez, L. [Dpto. Lenguajes y C. Computacion, E.T.S.I. Informatica, Universidad de Malaga, 29071 Malaga (Spain)
1999-10-01
We considered a general model for sizing a stand-alone photovoltaic system, using as energy input data the information available in any irradiation atlas. The parameters of the model are estimated by multivariate linear regression. The results obtained from the numerical loss-of-load-probability (LOLP) sizing method were used as initial input data to fit the model. For this fit we used daily global irradiation data taken from 222 US meteorological stations for the period 1961-1990. The proposed expression allows us to determine the photovoltaic array size with a coefficient of determination of 0.96, and this coefficient is independent of the LOLP value used. System parameters and mean monthly values of daily global irradiation on the module surface are taken as independent variables in the model. It is also shown that the proposed model can be used with the same accuracy for other locations not considered in its estimation. We also propose a model for calculating the optimum tilt of the array surface, taking into account the latitude as well as the variability of the incident irradiation.
Krohn, Olivia; Armbruster, Aaron; Gao, Yongsheng; Atlas Collaboration
2017-01-01
Software tools developed for the purpose of modeling CERN LHC pp collision data to aid in its interpretation are presented. Some measurements are not adequately described by a Gaussian distribution; thus an interpretation assuming Gaussian uncertainties will inevitably introduce bias, necessitating analytical tools to recreate and evaluate non-Gaussian features. One example is the measurement of Higgs boson production rates in different decay channels, and the interpretation of these measurements. The ratios of data to Standard Model expectations (μ) for five arbitrary signals were modeled by building five Poisson distributions with mixed signal contributions such that the measured values of μ are correlated. Algorithms were designed to recreate the probability distribution functions of μ as multivariate Gaussians, where the standard deviations (σ) and correlation coefficients (ρ) are parametrized. The 1-D likelihood contours of μ were modeled with good success, and the multi-dimensional distributions were well modeled within 1σ, but the model began to diverge beyond 2σ due to unmerited assumptions made in developing ρ. Future plans to improve the algorithms and develop a user-friendly analysis package are also discussed. NSF International Research Experiences for Students
Analysis of Multivariate Experimental Data Using A Simplified Regression Model Search Algorithm
Ulbrich, Norbert Manfred
2013-01-01
A new regression model search algorithm was developed in 2011 that may be used to analyze both general multivariate experimental data sets and wind tunnel strain-gage balance calibration data. The new algorithm is a simplified version of a more complex search algorithm that was originally developed at the NASA Ames Balance Calibration Laboratory. The new algorithm has the advantage that it needs only about one tenth of the original algorithm's CPU time for the completion of a search. In addition, extensive testing showed that the prediction accuracy of math models obtained from the simplified algorithm is similar to the prediction accuracy of math models obtained from the original algorithm. The simplified algorithm, however, cannot guarantee that search constraints related to a set of statistical quality requirements are always satisfied in the optimized regression models. Therefore, the simplified search algorithm is not intended to replace the original search algorithm. Instead, it may be used to generate an alternate optimized regression model of experimental data whenever the application of the original search algorithm either fails or requires too much CPU time. Data from a machine calibration of NASA's MK40 force balance is used to illustrate the application of the new regression model search algorithm.
Hampton, Stephanie E; Holmes, Elizabeth E; Scheef, Lindsay P; Scheuerell, Mark D; Katz, Stephen L; Pendleton, Daniel E; Ward, Eric J
2013-12-01
Long-term ecological data sets present opportunities for identifying drivers of community dynamics and quantifying their effects through time series analysis. Multivariate autoregressive (MAR) models are well known in many other disciplines, such as econometrics, but widespread adoption of MAR methods in ecology and natural resource management has been much slower despite some widely cited ecological examples. Here we review previous ecological applications of MAR models and highlight their ability to identify abiotic and biotic drivers of population dynamics, as well as community-level stability metrics, from long-term empirical observations. Thus far, MAR models have been used mainly with data from freshwater plankton communities; we examine the obstacles that may be hindering adoption in other systems and suggest practical modifications that will improve MAR models for broader application. Many of these modifications are already well known in other fields in which MAR models are common, although they are frequently described under different names. In an effort to make MAR models more accessible to ecologists, we include a worked example using recently developed R packages (MAR1 and MARSS), freely available and open-access software.
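The abstract points to the R packages MAR1 and MARSS; as a language-agnostic illustration of what an MAR(1) model is, the sketch below simulates a hypothetical three-"species" community x_t = A + B x_{t-1} + e_t and recovers the interaction matrix B by conditional least squares, the basic estimator underlying these methods.

```python
import numpy as np

rng = np.random.default_rng(3)

# Stable MAR(1) process: x_t = A + B @ x_{t-1} + e_t
# Off-diagonal entries of B encode biotic interactions between taxa
B_true = np.array([[0.5, -0.2, 0.0],
                   [0.1,  0.6, 0.1],
                   [0.0,  0.2, 0.4]])
A_true = np.array([1.0, 0.5, 0.2])
T = 2000
x = np.zeros((T, 3))
for t in range(1, T):
    x[t] = A_true + B_true @ x[t - 1] + rng.normal(0.0, 0.2, 3)

# Conditional least squares: regress x_t on [1, x_{t-1}]
Z = np.column_stack([np.ones(T - 1), x[:-1]])
coef, *_ = np.linalg.lstsq(Z, x[1:], rcond=None)
A_hat, B_hat = coef[0], coef[1:].T
print(np.round(B_hat, 2))
```

In ecological applications the columns of Z would also include abiotic covariates (temperature, nutrients), and community stability metrics are derived from the eigenvalues of the estimated B.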
Modeling and predicting spoilage of cooked, cured meat products by multivariate analysis.
Mataragas, Marios; Skandamis, Panagiotis; Nychas, George-John E; Drosinos, Eleftherios H
2007-11-01
A cooked, cured meat product is a perishable product spoiled mainly by lactic acid bacteria (LAB). LAB cause discoloration, slime formation, off-odors and off-flavors as the result of their metabolic activity producing various products. These microbial products in conjunction with the microbial population could be used to assess the degree of spoilage of this type of product. The spoilage evaluation was achieved by following a multivariate approach. Cluster analysis, principal component analysis and partial least square regression were employed to associate spoilage with microbiological and physicochemical parameters. The developed model was capable of giving accurate predictions of spoilage describing the spoilage associations. The study might contribute to the improvement of quality assurance systems of meat enterprises.
Harinath, Eranda; Mann, George K I
2008-06-01
This paper describes a design and two-level tuning method for fuzzy proportional-integral-derivative (FPID) controllers for a multivariable process, where the fuzzy inference uses the standard additive model. The proposed method can be used for any n x n multi-input-multi-output process and guarantees closed-loop stability. In the two-level tuning scheme, the tuning follows two steps: low-level tuning followed by high-level tuning. The low-level tuning adjusts apparent linear gains, whereas the high-level tuning changes the nonlinearity in the normalized fuzzy output. In this paper, two types of FPID configurations are considered, and their performances are evaluated on a real-time multizone temperature control problem having a 3 x 3 process system.
Probability modeling for robustness of multivariate LQG designing based on ship lateral motion
Institute of Scientific and Technical Information of China (English)
Anonymous
2005-01-01
The robustness of LQG design for the lateral motion of a ship is discussed for the case in which the ship's hydrodynamic parameters fluctuate randomly around their nominal values according to a proportional distribution. For a given ship state at a speed of 18 kn and a course of 45° under sea state 5, with the hydrodynamic parameters fluctuating randomly within ranges of ±10%, ±20% and ±30%, the robustness of the multivariate LQG design is analyzed by applying probability modeling of the relative control effect. The simulation results show that when the hydrodynamic parameters fluctuate, the relative control effect of the LQG design follows a normal distribution, and its mean value shows no remarkable change compared to the case without parameter perturbation.
Directory of Open Access Journals (Sweden)
Vasios C.E.
2003-01-01
Full Text Available In the present work, a new method for the classification of Event Related Potentials (ERPs) is proposed. The proposed method consists of two modules: the feature extraction module and the classification module. The feature extraction module implements a Multivariate Autoregressive model in conjunction with the Simulated Annealing technique for the selection of optimum features from ERPs. The classification module is implemented with a single three-layer neural network, trained with the back-propagation algorithm, which classifies the data into two classes: patients and control subjects. The method, in the form of a Decision Support System (DSS), has been thoroughly tested on a number of patient data sets (OCD, FES, depressive, and drug-user patients), resulting in successful classification rates of up to 100%.
A note on constrained M-estimation and its recursive analog in multivariate linear regression models
Institute of Scientific and Technical Information of China (English)
RAO, Calyampudi R.
2009-01-01
In this paper, the constrained M-estimation of the regression coefficients and scatter parameters in a general multivariate linear regression model is considered. Since the constrained M-estimation is not easy to compute, an updating recursion procedure is proposed to simplify the computation of the estimators when a new observation is obtained. We show that, under mild conditions, the recursion estimates are strongly consistent. In addition, the asymptotic normality of the recursive constrained M-estimators of the regression coefficients is established. A Monte Carlo simulation study of the recursion estimates is also provided. Finally, the robustness and asymptotic behavior of constrained M-estimators are briefly discussed.
Application of multivariate storage model to quantify trends in seasonally frozen soil
Directory of Open Access Journals (Sweden)
Woody Jonathan
2016-06-01
Full Text Available This article presents a study of the ground thermal regime recorded at 11 stations in the North Dakota Agricultural Network. Particular focus is placed on detecting trends in the annual ground freeze process portion of the ground thermal regime’s daily temperature signature. A multivariate storage model from queuing theory is fit to a quantity of estimated daily depths of frozen soil. Statistical inference on a trend parameter is obtained by minimizing a weighted sum of squares of a sequence of daily one-step-ahead predictions. Standard errors for the trend estimates are presented. It is shown that the daily quantity of frozen ground experienced at these 11 sites exhibited a negative trend over the observation period.
Application of multivariate storage model to quantify trends in seasonally frozen soil
Woody, Jonathan; Wang, Yan; Dyer, Jamie
2016-06-01
This article presents a study of the ground thermal regime recorded at 11 stations in the North Dakota Agricultural Network. Particular focus is placed on detecting trends in the annual ground freeze process portion of the ground thermal regime's daily temperature signature. A multivariate storage model from queuing theory is fit to a quantity of estimated daily depths of frozen soil. Statistical inference on a trend parameter is obtained by minimizing a weighted sum of squares of a sequence of daily one-step-ahead predictions. Standard errors for the trend estimates are presented. It is shown that the daily quantity of frozen ground experienced at these 11 sites exhibited a negative trend over the observation period.
Giacomo, Della Riccia; Stefania, Del Zotto
2013-12-15
Fumonisins are mycotoxins produced by Fusarium species that commonly live in maize. Whereas the fungi damage plants, fumonisins cause disease both in cattle and in human beings. Legal limits set the tolerable daily intake of fumonisins with respect to several maize-based feeds and foods. Chemical techniques provide the most reliable and accurate measurements, but they are expensive and time consuming. A method based on Near Infrared spectroscopy and multivariate statistical regression is described as a simpler, cheaper and faster alternative. We apply Partial Least Squares with full cross validation. Two models are described, having high correlations of calibration (0.995, 0.998) and of validation (0.908, 0.909), respectively. The description of the observed phenomenon is accurate and overfitting is avoided. Screening of contaminated maize with respect to the European legal limit of 4 mg kg(-1) should thus be assured.
Improved modeling of multivariate measurement errors based on the Wishart distribution.
Wentzell, Peter D; Cleary, Cody S; Kompany-Zareh, M
2017-03-22
The error covariance matrix (ECM) is an important tool for characterizing the errors from multivariate measurements, representing both the variance and covariance in the errors across multiple channels. Such information is useful in understanding and minimizing sources of experimental error and in the selection of optimal data analysis procedures. Experimental ECMs, normally obtained through replication, are inherently noisy, inconvenient to obtain, and offer limited interpretability. Significant advantages can be realized by building a model for the ECM based on established error types. Such models are less noisy, reduce the need for replication, mitigate mathematical complications such as matrix singularity, and provide greater insights. While the fitting of ECM models using least squares has been previously proposed, the present work establishes that fitting based on the Wishart distribution offers a much better approach. Simulation studies show that the Wishart method results in parameter estimates with a smaller variance and also facilitates the statistical testing of alternative models using a parameterized bootstrap method. The new approach is applied to fluorescence emission data to establish the acceptability of various models containing error terms related to offset, multiplicative offset, shot noise and uniform independent noise. The implications of the number of replicates, as well as single vs. multiple replicate sets are also described.
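The Wishart-based fitting idea can be sketched as follows. The ECM model here is an assumption chosen for brevity (iid noise plus a fully correlated offset term, Sigma = a*I + b*11^T), not one of the paper's fluorescence error models: a scatter matrix of replicate errors is Wishart-distributed, so the model parameters are estimated by maximizing the Wishart log-likelihood rather than by least squares on the empirical ECM.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import wishart

# Hypothetical ECM model: iid noise plus a fully correlated offset term,
#   Sigma(a, b) = a * I + b * 1 1^T   with a, b > 0
p, df = 8, 200
a_true, b_true = 1.0, 0.5
ones = np.ones((p, p))
Sigma_true = a_true * np.eye(p) + b_true * ones

# "Experimental" scatter matrix from df replicate error vectors:
# S ~ Wishart(Sigma_true, df)
S = wishart.rvs(df=df, scale=Sigma_true, random_state=4)

def negloglik(theta):
    a, b = np.exp(theta)                      # keep parameters positive
    Sigma = a * np.eye(p) + b * ones
    return -wishart.logpdf(S, df=df, scale=Sigma)

res = minimize(negloglik, x0=np.log([0.5, 0.1]), method="Nelder-Mead")
a_hat, b_hat = np.exp(res.x)
print(round(a_hat, 2), round(b_hat, 2))
```

Replacing the objective with a least-squares criterion on S reproduces the older approach the paper argues against; the likelihood version weights the matrix entries according to their actual sampling distribution.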
Directory of Open Access Journals (Sweden)
Yoonsu Shin
2016-01-01
Full Text Available In the 5G era, the operational cost of mobile wireless networks will significantly increase. Further, massive network capacity and zero latency will be needed because everything will be connected to mobile networks. Thus, self-organizing networks (SON) are needed, which expedite automatic operation of mobile wireless networks but face challenges in satisfying the 5G requirements. Therefore, researchers have proposed a framework to empower SON using big data. A recent framework for a big data-empowered SON analyzes the relationship between key performance indicators (KPIs) and related network parameters (NPs) using machine-learning tools, and it develops regression models using a Gaussian process with those parameters. The problem, however, is that the methods for finding the NPs related to a KPI differ from case to case. Moreover, the Gaussian process regression model cannot determine the relationship between a KPI and its various related NPs. In this paper, to solve these problems, we propose multivariate multiple regression models to determine the relationship between various KPIs and NPs. If we regard one KPI and multiple NPs as one set, the proposed models allow us to process multiple sets at one time. We can also find out whether some KPIs conflict with each other. We implement the proposed models using MapReduce.
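Multivariate multiple regression in this sense means one coefficient matrix linking several responses (KPIs) to several predictors (NPs) in a single fit. A minimal sketch on synthetic data (the 3 KPIs, 4 NPs, and coefficient values are all hypothetical, and the MapReduce deployment is omitted):

```python
import numpy as np

rng = np.random.default_rng(5)

# Y (n x 3 KPIs) = X (n x 4 NPs) @ C (4 x 3) + noise; near-zero entries of C
# flag NPs unrelated to a given KPI
n = 500
X = rng.normal(0.0, 1.0, (n, 4))
C_true = np.array([[ 1.0, 0.0, -0.5],
                   [ 0.0, 2.0,  0.0],
                   [-1.5, 0.0,  0.0],
                   [ 0.0, 0.0,  1.0]])
Y = X @ C_true + rng.normal(0.0, 0.1, (n, 3))

# One least-squares solve fits all KPI columns at once, i.e. multiple
# (KPI, NPs) sets are processed together
C_hat, *_ = np.linalg.lstsq(X, Y, rcond=None)
print(np.round(C_hat, 1))
```

Conflicting KPIs show up directly in the fitted coefficient matrix: an NP with coefficients of opposite sign in two columns improves one KPI at the expense of the other, which is the kind of diagnosis the single-response Gaussian process models could not provide.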
A Semi-parametric Multivariate Gap-filling Model for Eddy Covariance Latent Heat Flux
Li, M.; Chen, Y.
2010-12-01
Quantitative descriptions of latent heat fluxes are important to study the water and energy exchanges between terrestrial ecosystems and the atmosphere. The eddy covariance approach has been recognized as the most reliable technique for measuring surface fluxes over time scales ranging from hours to years. However, unfavorable micrometeorological conditions, instrument failures, and measurement limitations may cause inevitable flux gaps in time series data. Development and application of suitable gap-filling techniques are therefore crucial for estimating long-term fluxes. In this study, a semi-parametric multivariate gap-filling model was developed to fill latent heat flux gaps in eddy covariance measurements. Our approach combines the advantages of a multivariate statistical analysis (principal component analysis, PCA) and a nonlinear interpolation technique (K-nearest-neighbors, KNN). The PCA method was first used to resolve the multicollinearity among various hydrometeorological factors, such as radiation, soil moisture deficit, LAI, and wind speed. The KNN method was then applied as a nonlinear interpolation tool to estimate the flux gaps as the weighted sum of the latent heat fluxes of the K nearest neighbors in the PC domain. Two years (2008 and 2009) of eddy covariance and hydrometeorological data from a subtropical mixed evergreen forest (the Lien-Hua-Chih site) were collected to calibrate and validate the proposed approach with artificial gaps after standard QC/QA procedures. The optimal K values and weighting factors were determined by the maximum likelihood test. The gap-filled latent heat fluxes show that the developed model successfully preserves energy balances at daily, monthly, and yearly time scales. Annual evapotranspiration from the studied forest was 747 mm in 2008 and 708 mm in 2009. Nocturnal evapotranspiration was estimated with the filled gaps and the results are comparable with other studies
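The two-step PCA-then-KNN procedure can be sketched on synthetic data. The driver variables, the flux relationship, and the inverse-distance weighting below are illustrative assumptions (the study tuned K and the weights by a maximum likelihood test), and the numbers bear no relation to the Lien-Hua-Chih site.

```python
import numpy as np

rng = np.random.default_rng(6)

# Hypothetical drivers (radiation, soil moisture, LAI, wind stand-ins) and a
# latent heat flux depending on them nonlinearly
n = 400
drivers = rng.normal(0.0, 1.0, (n, 4))
flux = 100.0 + 40.0 * drivers[:, 0] + 10.0 * drivers[:, 0] * drivers[:, 1] \
       + rng.normal(0.0, 2.0, n)

# Artificial gaps in the flux record
gap = np.zeros(n, dtype=bool)
gap[rng.choice(n, 40, replace=False)] = True

# Step 1: PCA on standardized drivers to remove multicollinearity
Z = (drivers - drivers.mean(axis=0)) / drivers.std(axis=0)
U, S, Vt = np.linalg.svd(Z, full_matrices=False)
pcs = Z @ Vt.T                       # scores in the PC domain

# Step 2: KNN in PC space; fill each gap with an inverse-distance-weighted
# mean of the K nearest gap-free records
K = 5
filled = flux.copy()
for i in np.where(gap)[0]:
    d = np.linalg.norm(pcs[~gap] - pcs[i], axis=1)
    idx = np.argsort(d)[:K]
    w = 1.0 / (d[idx] + 1e-9)
    filled[i] = np.sum(w * flux[~gap][idx]) / w.sum()

rmse = np.sqrt(np.mean((filled[gap] - flux[gap]) ** 2))
print(round(rmse, 1))
```

Because neighbors are found in the orthogonal PC domain rather than in the raw (collinear) driver space, redundant drivers do not get double-counted in the distance metric.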
Wang, Zhe; Li, Lizhi; Ni, Weidou; Li, Zheng
2011-01-01
A multivariate dominant-factor-based non-linearized PLS model is proposed. The intensities of different lines were taken to construct a multivariate dominant factor model, which describes the dominant concentration information of the measured species. In constructing such a multivariate model, non-linear transformations of multiple characteristic line intensities were made according to the physical mechanisms of the laser-induced plasma spectrum, combined with the linear-correlation-based PLS method, to model the nonlinear self-absorption and inter-element interference effects. This enables the linear PLS method to describe non-linear relationships more accurately and provides the statistics-based PLS method with a physical background. Moreover, a secondary PLS is applied, utilizing the whole spectral information, to further correct the model results. Experiments were conducted using standard brass samples. A Taylor expansion was applied to make the nonlinear transformation describing the self-absorption effect of Cu. Then, li...
Prats-Montalbán, José M.; López, Fernando; Valiente, José M.; Ferrer, Alberto
2007-01-01
In this paper we present an innovative way to simultaneously perform feature extraction and classification for the quality control issue of surface grading by applying two well-known multivariate statistical projection tools (SIMCA and PLS-DA). These tools have been applied to compress the color texture data describing the visual appearance of surfaces (soft color texture descriptors) and to directly perform classification using statistics and predictions computed from the extracted projection models. Experiments have been carried out using an extensive image database of ceramic tiles (VxC TSG). This image database comprises 14 different models, 42 surface classes and 960 pieces. A factorial experimental design has been carried out to evaluate all the combinations of several factors affecting the accuracy rate. Factors include tile model, color representation scheme (CIE Lab, CIE Luv and RGB) and compression/classification approach (SIMCA and PLS-DA). In addition, a logistic regression model is fitted from the experiments to compute accuracy estimates and study the factors' effects. The results show that PLS-DA performs better than SIMCA, achieving a mean accuracy rate of 98.95%. These results outperform those obtained in a previous work, where the soft color texture descriptors in combination with the CIE Lab color space and the k-NN classifier achieved an accuracy of 97.36%.
Directory of Open Access Journals (Sweden)
Marcia Werlang
2008-08-01
Full Text Available The discrete wavelet transform (DWT, Daubechies) was used to compress the dimension of spectral infrared data for determination of the hydroxyl value (OHV) of soybean polyol samples. Spectral data were recorded between 650 and 4000 cm-1 with a 4 cm-1 resolution by Fourier transform infrared spectroscopy (FTIR) coupled with an attenuated total reflection (ATR) accessory. Regression models built with the partial least squares (PLS) and interval partial least squares (iPLS) methods were compared against models built on the original data and against each other. The spectral data set compressed to 1/4 of its original dimension gave the best result, with a lower RMSEP than the model built on the uncompressed signal and a similar correlation. A model of lower dimension but equal predictive capacity was thus obtained, establishing DWT as a robust method for reducing the dimension of spectral data sets when constructing multivariate regression models.
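The wavelet-compression step described above can be sketched in a few lines. This is an illustrative numpy-only sketch using the Haar wavelet (the simplest member of the Daubechies family) on synthetic data standing in for FTIR spectra; the array shapes and variable names are assumptions, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(8)

def haar_approx(x):
    """One level of the Haar DWT, keeping only the approximation coefficients."""
    return (x[..., 0::2] + x[..., 1::2]) / np.sqrt(2.0)

# Toy "spectra": 16 samples x 512 wavenumber points (stand-in for FTIR/ATR data).
spectra = rng.normal(0, 1, size=(16, 512)).cumsum(axis=1)

# Two levels of approximation coefficients -> 1/4 of the original dimension,
# as in the abstract; the compressed matrix would then feed a PLS regression.
compressed = haar_approx(haar_approx(spectra))
```

The compressed matrix has shape (16, 128), i.e. one quarter of the original 512 variables per spectrum, while retaining the low-frequency shape that carries most of the chemical information.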
Development of a scale down cell culture model using multivariate analysis as a qualification tool.
Tsang, Valerie Liu; Wang, Angela X; Yusuf-Makagiansar, Helena; Ryll, Thomas
2014-01-01
In characterizing a cell culture process to support regulatory activities such as process validation and Quality by Design, developing a representative scale-down model for design space definition is of great importance. The manufacturing bioreactor should ideally reproduce bench scale performance with respect to all measurable parameters. However, due to intrinsic geometric differences between scales, process performance at manufacturing scale often varies from bench scale performance, typically exhibiting differences in parameters such as cell growth, protein productivity, and/or dissolved carbon dioxide concentration. Here, we describe a case study in which a bench scale cell culture process model is developed to mimic historical manufacturing scale performance for a late stage CHO-based monoclonal antibody program. Using multivariate analysis (MVA) as the primary data analysis tool, in addition to traditional univariate analysis techniques, to identify gaps between scales, process adjustments were implemented at bench scale, resulting in an improved scale-down cell culture process model. Finally, we propose an approach for small scale model qualification including three main aspects: MVA, comparison of key physiological rates, and comparison of product quality attributes.
Cheng, Qing; Lu, Xin; Wu, Joseph T.; Liu, Zhong; Huang, Jincai
2016-01-01
Guangdong experienced the largest dengue epidemic in recent history. In 2014, the number of dengue cases was the highest in the previous 10 years and comprised more than 90% of all cases. In order to analyze the heterogeneous transmission of dengue, a multivariate time series model decomposing dengue risk additively into endemic, autoregressive and spatiotemporal components was used to model dengue transmission. Moreover, random effects were introduced into the model to deal with heterogeneous dengue transmission and incidence levels, and a power-law approach was embedded in the model to account for spatial interaction. There was little spatial variation in the autoregressive component. In contrast, for the endemic component, there was a pronounced heterogeneity between the Pearl River Delta area and the remaining districts. For the spatiotemporal component, there was considerable heterogeneity across districts, with the highest values in some western and eastern districts. Clustering analysis revealed the patterns driving dengue transmission. The endemic component appears to be important in the Pearl River Delta area, where the incidence is high (95 per 100,000), while areas with relatively low incidence (4 per 100,000) are highly dependent on spatiotemporal spread and local autoregression. PMID:27666657
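The additive risk decomposition described in this abstract can be illustrated with a small numpy sketch. The model below is a simplified, hypothetical version of such an endemic/autoregressive/spatiotemporal decomposition (in the spirit of the `hhh4` surveillance framework); the district counts, power-law weights, and parameter values are all invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: 4 districts, 30 weeks of simulated case counts.
n_districts, n_weeks = 4, 30
y = rng.poisson(5, size=(n_districts, n_weeks)).astype(float)

# Power-law spatial weights w_ij = d_ij^-rho (illustrative distances).
dist = np.abs(np.subtract.outer(np.arange(n_districts),
                                np.arange(n_districts))).astype(float)
np.fill_diagonal(dist, np.inf)        # no self-neighbourhood (inf**-rho -> 0)
rho = 2.0
w = dist ** -rho
w /= w.sum(axis=1, keepdims=True)     # row-normalise

# Additive mean decomposition for district i, week t:
#   mu_it = endemic_i + lambda * y_{i,t-1} + phi * sum_j w_ij * y_{j,t-1}
endemic = y.mean(axis=1) * 0.3        # endemic level (placeholder values)
lam, phi = 0.5, 0.2                   # autoregressive / spatiotemporal weights

mu = endemic[:, None] + lam * y[:, :-1] + phi * (w @ y[:, :-1])
```

`mu` holds one fitted mean per district for weeks t = 1..29; in the real model the three components are estimated jointly with district-level random effects rather than fixed by hand.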
Persuasive Technology and Business Models
DEFF Research Database (Denmark)
Søndergaard, Morten Karnøe; Lindgren, Peter; Veirum, Niels Einar
specific behavior; this results in the ability to design for specific changes. Businesses use different persuasive technologies to persuade users, customers and network partners to change behavior. Operating more than one value proposition, both tangible and intangible, in combination...... seems to be crucial to the success of a persuasive business model. We will give a short introduction to the area of persuasive technology and business models. Moreover, we will present a number of concrete case examples where persuasive technologies were employed, the first in health care, the second......
Directory of Open Access Journals (Sweden)
Eric H Y Lau
Full Text Available BACKGROUND: Multiple sources of influenza surveillance data are becoming more available; however, integration of these data streams for situational awareness of influenza activity is less explored. METHODS AND RESULTS: We applied multivariate time-series methods to sentinel outpatient and school absenteeism surveillance data in Hong Kong during 2004-2009. School absenteeism data and outpatient surveillance data experienced interruptions due to school holidays and changes in public health guidelines during the pandemic, including school closures and the establishment of special designated flu clinics, which in turn provided 'drop-in' fever counts surveillance data. A multivariate dynamic linear model was used to monitor influenza activity throughout epidemics based on all available data. The inferred level followed influenza activity closely at different times, while the inferred trend performed less well when influenza activity was low. Correlations between the inferred level and trend from the multivariate model and reference influenza activity, measured by the product of weekly laboratory influenza detection rates and weekly general practitioner influenza-like illness consultation rates, were calculated and compared with those from univariate models. Over the whole study period, there was a significantly higher correlation (ρ = 0.82, p ≤ 0.02) for the inferred trend based on the multivariate model compared to other univariate models, while the inferred trend from the multivariate model performed as well as the best univariate model in the pre-pandemic and the pandemic period. The inferred trend and level from the multivariate model were able to match, if not outperform, the best univariate model despite missing data and the drop-in and drop-out of different surveillance data streams. An overall influenza index combining level and trend was constructed to demonstrate another potential use of the method. CONCLUSIONS: Our results demonstrate the
Ben Alaya, M. A.; Chebana, F.; Ouarda, T. B. M. J.
2016-09-01
Statistical downscaling techniques are required to refine atmosphere-ocean global climate data and provide reliable meteorological information such as a realistic temporal variability and relationships between sites and variables in a changing climate. To this end, the present paper introduces a modular structure combining two statistical tools of increasing interest during the last years: (1) Gaussian copula and (2) quantile regression. The quantile regression tool is employed to specify the entire conditional distribution of downscaled variables and to address the limitations of traditional regression-based approaches whereas the Gaussian copula is performed to describe and preserve the dependence between both variables and sites. A case study based on precipitation and maximum and minimum temperatures from the province of Quebec, Canada, is used to evaluate the performance of the proposed model. Obtained results suggest that this approach is capable of generating series with realistic correlation structures and temporal variability. Furthermore, the proposed model performed better than a classical multisite multivariate statistical downscaling model for most evaluation criteria.
Nieto, P J García; Antón, J C Álvarez; Vilán, J A Vilán; García-Gonzalo, E
2015-05-01
The aim of this research work is to build a regression model of air quality by using the multivariate adaptive regression splines (MARS) technique in the Oviedo urban area (northern Spain) at a local scale. To accomplish the objective of this study, the experimental data set, made up of nitrogen oxides (NOx), carbon monoxide (CO), sulfur dioxide (SO2), ozone (O3), and dust (PM10), was collected over 3 years (2006-2008). The US National Ambient Air Quality Standards (NAAQS) establish the limit values of the main pollutants in the atmosphere in order to protect public health. Firstly, this MARS regression model captures the main insight of statistical learning theory in order to obtain a good prediction of the dependence among the main pollutants in the Oviedo urban area. Secondly, the main advantages of MARS are its capacity to produce simple, easy-to-interpret models, its ability to estimate the contributions of the input variables, and its computational efficiency. Finally, on the basis of these numerical calculations, using the MARS technique, the conclusions of this research work are presented.
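The hinge ("hockey-stick") basis functions at the heart of MARS can be illustrated with a toy regression. Real MARS searches adaptively over knots and variables; this numpy sketch fixes a single knot and fits the mirrored hinge pair by least squares, with all data simulated rather than taken from the Oviedo measurements.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy pollutant relationship: a response with a kink at x = 1 (e.g. a
# hypothetical NOx-temperature dependence).
x = rng.uniform(-5, 5, size=200)
y_true = 2.0 + 1.5 * np.maximum(0, x - 1.0) - 0.5 * np.maximum(0, 1.0 - x)
y = y_true + rng.normal(0, 0.2, size=200)

# MARS-style basis: intercept plus the mirrored hinge pair h(x - t), h(t - x)
# at a fixed knot t (MARS itself would search over candidate knots).
t = 1.0
B = np.column_stack([np.ones_like(x),
                     np.maximum(0, x - t),
                     np.maximum(0, t - x)])
coef, *_ = np.linalg.lstsq(B, y, rcond=None)
```

The recovered coefficients approximate the generating values (2.0, 1.5, -0.5), showing how a piecewise-linear MARS fit represents a regime change that a single global slope cannot.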
Multivariate Autoregressive Model Based Heart Motion Prediction Approach for Beating Heart Surgery
Directory of Open Access Journals (Sweden)
Fan Liang
2013-02-01
Full Text Available A robotic tool can enable a surgeon to conduct off-pump coronary artery bypass graft surgery on a beating heart. The robotic tool actively alleviates the relative motion between the point of interest (POI) on the heart surface and the surgical tool and allows the surgeon to operate as if the heart were stationary. Since the beating heart's motion has a relatively high bandwidth, with nonlinear and nonstationary characteristics, it is difficult to follow. Thus, precise beating heart motion prediction is necessary for the tracking control procedure during the surgery. In the research presented here, we first observe that the electrocardiography (ECG) signal contains causal phase information on heart motion and non-stationary heart rate dynamic variations. Then, we investigate the relationship between the ECG signal and beating heart motion using Granger Causality Analysis, which demonstrates the feasibility of improved prediction of heart motion. Next, we propose a nonlinear time-varying multivariate vector autoregressive (MVAR) model based adaptive prediction method. In this model, the significant correlation between ECG and heart motion enables the improvement of the prediction of sharp changes in heart motion and the approximation of the motion with sufficient detail. Dual Kalman Filters (DKF) estimate the states and parameters of the model, respectively. Last, we evaluate the proposed algorithm through comparative experiments using two sets of collected in vivo data.
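A minimal sketch of the MVAR idea: fit a first-order vector autoregression by least squares on simulated two-channel data (standing in for a heart-motion axis plus an ECG-derived signal) and use it for one-step-ahead prediction. The adaptive dual-Kalman estimation of the paper is not reproduced here; the coefficient matrix and noise level are invented.

```python
import numpy as np

rng = np.random.default_rng(3)

# Simulate a stable VAR(1): x_t = A x_{t-1} + noise, where channel 2 (the
# "ECG-like" signal) drives channel 1 (the "motion-like" signal).
A_true = np.array([[0.8, 0.2],
                   [0.0, 0.5]])
n = 500
x = np.zeros((n, 2))
for t in range(1, n):
    x[t] = A_true @ x[t - 1] + rng.normal(0, 0.1, size=2)

# Least-squares fit of the MVAR coefficients from lagged sample pairs.
X_past, X_next = x[:-1], x[1:]
B, *_ = np.linalg.lstsq(X_past, X_next, rcond=None)
A_hat = B.T                         # so that x_t ≈ A_hat @ x_{t-1}

# One-step-ahead prediction for the last sample.
pred = A_hat @ x[-2]
```

With 500 samples the estimated coefficient matrix `A_hat` recovers `A_true` closely; the paper's contribution is to make these coefficients time-varying and to track them online with dual Kalman filters.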
The Detection of Metabolite-Mediated Gene Module Co-Expression Using Multivariate Linear Models.
Directory of Open Access Journals (Sweden)
Trishanta Padayachee
Full Text Available Investigating whether metabolites regulate the co-expression of a predefined gene module is one of the relevant questions posed in the integrative analysis of metabolomic and transcriptomic data. This article concerns the integrative analysis of the two high-dimensional datasets by means of multivariate models and statistical tests for the dependence between metabolites and the co-expression of a gene module. The general linear model (GLM for correlated data that we propose models the dependence between adjusted gene expression values through a block-diagonal variance-covariance structure formed by metabolic-subset specific general variance-covariance blocks. Performance of statistical tests for the inference of conditional co-expression are evaluated through a simulation study. The proposed methodology is applied to the gene expression data of the previously characterized lipid-leukocyte module. Our results show that the GLM approach improves on a previous approach by being less prone to the detection of spurious conditional co-expression.
Update on Multi-Variable Parametric Cost Models for Ground and Space Telescopes
Stahl, H. Philip; Henrichs, Todd; Luedtke, Alexander; West, Miranda
2012-01-01
Parametric cost models can be used by designers and project managers to perform relative cost comparisons between major architectural cost drivers and allow high-level design trades; enable cost-benefit analysis for technology development investment; and, provide a basis for estimating total project cost between related concepts. This paper reports on recent revisions and improvements to our ground telescope cost model and refinements of our understanding of space telescope cost models. One interesting observation is that while space telescopes are 50X to 100X more expensive than ground telescopes, their respective scaling relationships are similar. Another interesting speculation is that the role of technology development may be different between ground and space telescopes. For ground telescopes, the data indicates that technology development tends to reduce cost by approximately 50% every 20 years. But for space telescopes, there appears to be no such cost reduction because we do not tend to re-fly similar systems. Thus, instead of reducing cost, 20 years of technology development may be required to enable a doubling of space telescope capability. Other findings include: mass should not be used to estimate cost; spacecraft and science instrument costs account for approximately 50% of total mission cost; and, integration and testing accounts for only about 10% of total mission cost.
Zhu, Yuda; Weiss, Robert E
2013-03-01
Longitudinal behavioral intervention trials to reduce HIV transmission risk collect complex multilevel and multivariate data longitudinally for each subject, with important correlation structures across time, level, and variables. Accurately assessing the effects of these trials is critical for determining which interventions are effective. Both numbers of partners and numbers of sex acts with each partner are reported at each time point. Sex acts with each partner are further differentiated into protected and unprotected acts with correspondingly differing risks of HIV/STD transmission. These trials generally also have eligibility criteria limiting enrollment to participants with some minimal level of risky sexual behavior tied directly to the outcome of interest. The combination of these factors makes it difficult to quantify sexual behaviors and the effects of intervention. We propose a multivariate multilevel count model that simultaneously models the number of partners and acts within partners, and accounts for recruitment eligibility. Our methods are useful in the evaluation of intervention trials and provide a more accurate and complete model for sexual behavior. We illustrate the contributions of our model by examining seroadaptive behavior, defined as risk-reducing behavior that depends on the serostatus of the partner. Several forms of seroadaptive risk-reducing behavior are quantified and distinguished from nonseroadaptive risk-reducing behavior. Copyright © 2013, The International Biometric Society.
Sakaguchi, Kaori; Nagatsuma, Tsutomu; Reeves, Geoffrey D.; Spence, Harlan E.
2015-12-01
The Van Allen radiation belts surrounding the Earth are filled with MeV-energy electrons. This region poses ionizing radiation risks for spacecraft that operate within it, including those in geostationary orbit (GEO) and medium Earth orbit. To provide alerts of electron flux enhancements, 16 prediction models of the electron log-flux variation throughout the equatorial outer radiation belt as a function of the McIlwain L parameter were developed using the multivariate autoregressive model and Kalman filter. Measurements of omnidirectional 2.3 MeV electron flux from the Van Allen Probes mission as well as >2 MeV electrons from the GOES 15 spacecraft were used as the predictors. Model explanatory parameters were selected from solar wind parameters, the electron log-flux at GEO, and geomagnetic indices. For the innermost region of the outer radiation belt, the electron flux is best predicted by using the Dst index as the sole input parameter. For the central to outermost regions, at L ≥ 4.8 and L ≥ 5.6, the electron flux is predicted most accurately by also including the solar wind velocity and then the dynamic pressure, respectively. The Dst index is the best overall single parameter for predicting at 3 ≤ L ≤ 6, while for the GEO flux prediction, the Kp index is better than Dst. A test calculation demonstrates that the model successfully predicts the timing and location of the flux maximum as much as 2 days in advance and that the electron flux decreases faster with time at higher L values, both model features consistent with the actually observed behavior.
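The Kalman-filter component of such a prediction scheme can be sketched for a single scalar state. This toy model assumes invented dynamics in which log-flux follows an AR(1) driven by a Dst-like input; none of the parameter values come from the paper, and the real models run one such filter per L shell with several exogenous drivers.

```python
import numpy as np

rng = np.random.default_rng(4)

# Illustrative scalar state-space model for electron log-flux at one L shell:
#   state:       f_t = a * f_{t-1} + b * dst_t + w_t,   w_t ~ N(0, Q)
#   observation: y_t = f_t + v_t,                        v_t ~ N(0, R)
a, b, Q, R = 0.9, -0.02, 0.05, 0.2
n = 200
dst = -30 * np.abs(np.sin(np.arange(n) / 15.0))   # toy Dst-like driver
f = np.zeros(n)
for t in range(1, n):
    f[t] = a * f[t - 1] + b * dst[t] + rng.normal(0, np.sqrt(Q))
y = f + rng.normal(0, np.sqrt(R), size=n)

# Kalman filter recursion: predict with the dynamics, update with the data.
f_hat, P = 0.0, 1.0
estimates = []
for t in range(n):
    f_pred = a * f_hat + b * dst[t]          # predict
    P_pred = a * a * P + Q
    K = P_pred / (P_pred + R)                # Kalman gain
    f_hat = f_pred + K * (y[t] - f_pred)     # update
    P = (1 - K) * P_pred
    estimates.append(f_hat)
estimates = np.array(estimates)

# The filtered estimate should beat the raw observations in mean squared error.
err_filter = np.mean((estimates - f) ** 2)
err_obs = np.mean((y - f) ** 2)
```

Because the filter blends the noisy observation with the model forecast, its error settles well below the observation noise R, which is what makes multi-day-ahead flux alerts feasible.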
Xu, Peng; Rizzoni, Elizabeth Anne; Sul, Se-Yeong; Stephanopoulos, Gregory
2017-01-20
Metabolic engineering entails targeted modification of cell metabolism to maximize the production of a specific compound. To empower combinatorial optimization in strain engineering, tools and algorithms are needed to efficiently sample the multidimensional gene expression space and locate the desirable overproduction phenotype. We addressed this challenge by employing design of experiments (DoE) models to quantitatively correlate gene expression with strain performance. By fractionally sampling the gene expression landscape, we statistically screened the dominant enzyme targets that determine metabolic pathway efficiency. An empirical quadratic regression model was subsequently used to identify the optimal gene expression patterns of the investigated pathway. As a proof of concept, our approach yielded the natural product violacein at 525.4 mg/L in shake flasks, a 3.2-fold increase from the baseline strain. Violacein production was further increased to 1.31 g/L in a controlled benchtop bioreactor. We found that formulating discretized gene expression levels into logarithmic variables (Linlog transformation) was essential for implementing this DoE-based optimization procedure. The reported methodology can aid multivariate combinatorial pathway engineering and may be generalized as a standard procedure for accelerating strain engineering and improving metabolic pathway efficiency.
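The quadratic-regression step can be illustrated with a hypothetical two-gene example: fit a second-order response surface to sampled expression levels and solve for the stationary point, which serves as the candidate optimal expression pattern. All numbers below are invented; the paper's actual pathway has more genes and uses designed (fractional-factorial) rather than random sampling.

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical titre response over two log-scaled expression levels x1, x2,
# with an interior optimum (negative curvature in both directions).
def titre(x1, x2):
    return 500 - 30 * (x1 - 1.0) ** 2 - 20 * (x2 + 0.5) ** 2 + 5 * x1 * x2

# Sample the expression landscape and measure noisy titres.
x1 = rng.uniform(-2, 2, size=40)
x2 = rng.uniform(-2, 2, size=40)
y = titre(x1, x2) + rng.normal(0, 2.0, size=40)

# Quadratic model: y = b0 + b1 x1 + b2 x2 + b11 x1^2 + b22 x2^2 + b12 x1 x2.
X = np.column_stack([np.ones_like(x1), x1, x2, x1 ** 2, x2 ** 2, x1 * x2])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

# Stationary point of the fitted surface (candidate optimal expression levels):
# set the gradient to zero and solve H @ x = -[b1, b2].
H = np.array([[2 * beta[3], beta[5]],
              [beta[5], 2 * beta[4]]])
x_opt = np.linalg.solve(H, -beta[1:3])
```

The solved stationary point lands near the generating optimum (about x1 ≈ 0.97, x2 ≈ -0.38 for these coefficients), mirroring how the fitted DoE model points to the expression pattern to construct next.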
National scale multivariate extreme value modelling of waves, winds and sea levels
Directory of Open Access Journals (Sweden)
Gouldby Ben
2016-01-01
Full Text Available It has long been recognised that extreme coastal flooding can arise from the joint occurrence of extreme waves, winds and sea levels. The standard simplified joint probability approach used in England and Wales can result in an underestimation of flood risk unless correction factors are applied. This paper describes the application of a state-of-the-art multivariate extreme value model to offshore winds, waves and sea levels around the coast of England. The methodology overcomes the limitations of the traditional method. The output of the new statistical analysis is a Monte-Carlo (MC) simulation comprising many thousands of offshore extreme events, and it is necessary to translate all of these events into overtopping rates for use as input to flood risk assessments. It is computationally impractical to transform all of these MC events from the offshore to the nearshore. Computationally efficient statistical emulators of the SWAN wave transformation model have therefore been constructed. The emulators translate the thousands of offshore MC events to the nearshore. Whilst the methodology has been applied for national flood risk assessment, it has the potential to be implemented for wider use, including climate change impact assessment, nearshore wave climates for detailed local assessments, and coastal flood forecasting.
Application of Multivariate Modeling for Radiation Injury Assessment: A Proof of Concept
Directory of Open Access Journals (Sweden)
David L. Bolduc
2014-01-01
Full Text Available Multivariate radiation injury estimation algorithms were formulated for estimating severe hematopoietic acute radiation syndrome (H-ARS) injury (i.e., response category three or RC3) in a rhesus monkey total-body irradiation (TBI) model. Classical CBC and serum chemistry blood parameters were examined prior to irradiation (d 0) and on d 7, 10, 14, 21, and 25 after irradiation involving 24 nonhuman primates (NHP; Macaca mulatta) given 6.5-Gy 60Co γ-rays (0.4 Gy min−1) TBI. A correlation matrix was formulated with the RC3 severity level designated as the “dependent variable” and independent variables down-selected based on their radioresponsiveness and relatively low multicollinearity using stepwise linear regression analyses. Final candidate independent variables included CBC counts (absolute numbers of neutrophils, lymphocytes, and platelets) in formulating the “CBC” RC3 estimation algorithm. Additionally, the formulation of a diagnostic CBC and serum chemistry “CBC-SCHEM” RC3 algorithm expanded upon the CBC algorithm model with the addition of hematocrit and the serum enzyme levels of aspartate aminotransferase, creatine kinase, and lactate dehydrogenase. Both algorithms estimated RC3 with over 90% predictive power. Only the CBC-SCHEM RC3 algorithm, however, met the critical three assumptions of linear least squares, demonstrating slightly greater precision for radiation injury estimation but with significantly decreased prediction error, indicating increased statistical robustness.
Review Content Analytics for the Prediction of Learner’s Feedback with Multivariate Regression Model
Directory of Open Access Journals (Sweden)
T. Chellatamilan
2015-06-01
Full Text Available E-learning facilitates both synchronous and asynchronous learning and plays a very important role in the teaching-learning process. A large group of learners engage in idea exchange independently by interacting with the members present in the learning management system. In order to generate meaningful learning outcomes for individual peer learners, feedback review is essential to extract the conceptual content that reflects the learner's instantaneous behavior, emotions, capabilities, interests and difficulties, and to address them effectively. Collecting feedback on a numeric scale is difficult for both learners and facilitators when specifying a rating, but it is easy for learners to provide feedback in the form of text messages. The key challenge for analysts is to extract the meaningful feedback content and a dynamic rating of the learner's feedback related to various conceptual contexts. We propose a novel method using a multivariate predictive model for conceptual content analytics based on e-learners' reviews, using the standard statistical model of inverse regression. Finally, the analysis is used in prediction studies to illustrate its effectiveness against the learners' feedback.
Energy Technology Data Exchange (ETDEWEB)
Fouque, A.L.; Ciuciu, Ph.; Risser, L. [NeuroSpin/CEA, F-91191 Gif-sur-Yvette (France); Fouque, A.L.; Ciuciu, Ph.; Risser, L. [IFR 49, Institut d'Imagerie Neurofonctionnelle, Paris (France)
2009-07-01
In this paper, a novel statistical parcellation of intra-subject functional MRI (fMRI) data is proposed. The key idea is to identify functionally homogeneous regions of interest from their hemodynamic parameters. To this end, a non-parametric voxel-based estimation of the hemodynamic response function is performed as a prerequisite. Then, the extracted hemodynamic features are entered as the input data of a Multivariate Spatial Gaussian Mixture Model (MSGMM) to be fitted. The goal of the spatial aspect is to favor the recovery of connected components in the mixture. Our statistical clustering approach is original in the sense that it extends existing work done on univariate spatially regularized Gaussian mixtures. A specific Gibbs sampler is derived to account for different covariance structures in the feature space. On realistic artificial fMRI datasets, it is shown that our algorithm is helpful for identifying a parsimonious functional parcellation required in the context of joint detection-estimation of brain activity. This allows us to overcome the classical assumption of spatial stationarity of the BOLD signal model. (authors)
Satellite image blind restoration based on surface fitting and multivariate model
Institute of Scientific and Technical Information of China (English)
CHEN Xin-bing; YANG Shi-zhi; WANG Xian-hua; QIAO Yan-li
2009-01-01
Owing to the blurring effect from the atmosphere and camera system in satellite imaging, a blind image restoration algorithm is proposed which includes modulation transfer function (MTF) estimation and image restoration. In the MTF estimation stage, based on each degradation process of the satellite imaging chain, a combined parametric model of the MTF is given and used to fit the surface of the normalized logarithmic amplitude spectrum of the degraded image. In the image restoration stage, a maximum a posteriori (MAP) based edge-preserving image restoration method is presented which introduces a multivariate Laplacian model to characterize the prior distribution of the wavelet coefficients of the original image. During the image restoration, in order to avoid solving highly nonlinear equations, an optimization transfer algorithm is adopted to decompose the image restoration procedure into two simple steps: Landweber iteration and wavelet thresholding denoising. In the numerical experiment, the satellite image restoration results from SPOT-5 and the high resolution camera (HR) of the China-Brazil Earth Resource Satellite (CBERS-02B) are compared, and the proposed algorithm is superior in image edge preservation and noise inhibition.
Tamborrino, Massimiliano; Sacerdote, Laura; Jacobsen, Martin
2014-11-01
We consider the multivariate point process determined by the crossing times of the components of a multivariate jump process through a multivariate boundary, assuming that each component is reset to an initial value after its boundary crossing. We prove that this point process converges weakly to the point process determined by the crossing times of the limit process. This holds for both diffusion and deterministic limit processes. The almost sure convergence of the first passage times under the almost sure convergence of the processes is also proved. The particular case of a multivariate Stein process converging to a multivariate Ornstein-Uhlenbeck process is discussed as a guideline for applying diffusion limits to jump processes. We apply our theoretical findings to neural network modeling. The proposed model gives a mathematical foundation to the generalization of the class of Leaky Integrate-and-Fire models for single neural dynamics to the case of a firing network of neurons. This will help future studies of dependent spike trains.
Advanced Mirror & Modelling Technology Development
Effinger, Michael; Stahl, H. Philip; Abplanalp, Laura; Maffett, Steven; Egerman, Robert; Eng, Ron; Arnold, William; Mosier, Gary; Blaurock, Carl
2014-01-01
The 2020 Decadal technology survey is starting in 2018. Technology on the shelf at that time will help guide selection to future low risk and low cost missions. The Advanced Mirror Technology Development (AMTD) team has identified development priorities based on science goals and engineering requirements for Ultraviolet Optical near-Infrared (UVOIR) missions in order to contribute to the selection process. One key development identified was lightweight mirror fabrication and testing. A monolithic, stacked, deep core mirror was fused and replicated twice to achieve the desired radius of curvature. It was subsequently successfully polished and tested. A recently awarded second phase to the AMTD project will develop larger mirrors to demonstrate the lateral scaling of the deep core mirror technology. Another key development was rapid modeling for the mirror. One model focused on generating optical and structural model results in minutes instead of months. Many variables could be accounted for regarding the core, face plate and back structure details. A portion of a spacecraft model was also developed. The spacecraft model incorporated direct integration to transform optical path difference to Point Spread Function (PSF) and between PSF to modulation transfer function. The second phase to the project will take the results of the rapid mirror modeler and integrate them into the rapid spacecraft modeler.
Technology Transfer Issues and a New Technology Transfer Model
Choi, Hee Jun
2009-01-01
The following are major issues that should be considered for efficient and effective technology transfer: conceptions of technology, technological activity and transfer, communication channels, factors affecting transfer, and models of transfer. In particular, a well-developed model of technology transfer could be used as a framework for…
Multivariate analysis with LISREL
Jöreskog, Karl G; Y Wallentin, Fan
2016-01-01
This book traces the theory and methodology of multivariate statistical analysis and shows how it can be conducted in practice using the LISREL computer program. It presents not only the typical uses of LISREL, such as confirmatory factor analysis and structural equation models, but also several other multivariate analysis topics, including regression (univariate, multivariate, censored, logistic, and probit), generalized linear models, multilevel analysis, and principal component analysis. It provides numerous examples from several disciplines and discusses and interprets the results, illustrated with sections of output from the LISREL program, in the context of the example. The book is intended for masters and PhD students and researchers in the social, behavioral, economic and many other sciences who require a basic understanding of multivariate statistical theory and methods for their analysis of multivariate data. It can also be used as a textbook on various topics of multivariate statistical analysis.
Lanting, Rosanne; Nooraee, Nazanin; Werker, Paul M N; van den Heuvel, Edwin R
2014-09-01
Dupuytren disease affects fingers in a variable fashion. Knowledge about specific disease patterns (phenotype) based on location and severity of the disease is lacking. In this cross-sectional study, 344 primary affected hands with Dupuytren disease were physically examined. The Pearson correlation coefficient between the coexistence of Dupuytren disease in pairs of fingers was calculated, and agglomerative hierarchical clustering was applied to identify possible clusters of affected fingers. With a multivariate ordinal logit model, the authors studied the correlation on severity, taking into account age and sex, and tested hypotheses on independence between groups of fingers. The ring finger was most frequently affected by Dupuytren disease, and contractures were seen in 15.1 percent of affected rays. The severity of thumb and index finger, middle and ring fingers, and middle and little fingers was significantly correlated. Occurrences in pairs of fingers were highest in the middle and ring fingers and lowest in the thumb and index finger. Correlation between the ring and little fingers and a correlation between fingers from the ulnar and radial sides could not be demonstrated. Rays on the ulnar side of the hand are predominantly affected. The middle finger is substantially correlated with other fingers on the ulnar side, and the thumb and index finger are correlated; however, there was no evidence that the ulnar side and the radial side were correlated in any way, which suggests that occurrence on one side of the hand does not predict Dupuytren disease on the other side of the hand. Risk, III.
A Bayesian design space for analytical methods based on multivariate models and predictions.
Lebrun, Pierre; Boulanger, Bruno; Debrus, Benjamin; Lambert, Philippe; Hubert, Philippe
2013-01-01
The International Conference for Harmonization (ICH) has released regulatory guidelines for pharmaceutical development. In the document ICH Q8, the design space of a process is presented as the set of factor settings providing satisfactory results. However, ICH Q8 does not propose any practical methodology to define, derive, and compute the design space. In parallel, in the last decades, it has been observed that the diversity and the quality of analytical methods have evolved exponentially, allowing substantial gains in selectivity and sensitivity. However, there is still a lack of a rationale for developing robust separation methods in a systematic way. Applying ICH Q8 to analytical methods provides a methodology for predicting a region of the factor space in which results will be reliable. Combining design of experiments and standard Bayesian multivariate regression, an analytical form of the predictive distribution of a new response vector is identified and used, under noninformative as well as informative prior distributions of the parameters. From the responses and their predictive distribution, various critical quality attributes can be easily derived. This Bayesian framework was then extended to the multicriteria setting to estimate the predictive probability that several critical quality attributes will be jointly achieved in the future use of an analytical method. An example based on a high-performance liquid chromatography (HPLC) method is given. For this example, a constrained sampling scheme was applied to ensure the modeled responses have desirable properties.
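The joint-probability idea in this abstract can be sketched with a small Monte Carlo computation. The predictive mean, covariance, and acceptance limits below are hypothetical stand-ins, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical predictive distribution of two critical quality attributes
# (say, chromatographic resolution and run time) at one factor setting.
pred_mean = np.array([2.0, 14.0])
pred_cov = np.array([[0.09, -0.02],
                     [-0.02, 1.00]])

# Acceptance limits: resolution >= 1.5 and run time <= 15 min.
draws = rng.multivariate_normal(pred_mean, pred_cov, size=100_000)
joint_ok = (draws[:, 0] >= 1.5) & (draws[:, 1] <= 15.0)
p_joint = joint_ok.mean()  # estimated probability both CQAs are met jointly
```

Factor settings whose estimated joint probability exceeds a chosen threshold (e.g. 0.9) would then constitute the design space.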
Siggiridou, Elsa; Kugiumtzis, Dimitris
2016-04-01
Granger causality has been used for the investigation of the interdependence structure of the underlying systems of multivariate time series. In particular, the direct causal effects are commonly estimated by the conditional Granger causality index (CGCI). In the presence of many observed variables and relatively short time series, CGCI may fail because it is based on vector autoregressive (VAR) models involving a large number of coefficients to be estimated. In this work, the VAR is restricted by a scheme that modifies the recently developed method of backward-in-time selection (BTS) of the lagged variables, and the CGCI is combined with BTS. Further, the proposed approach is shown to compare favorably with other restricted VAR representations, such as the top-down strategy, the bottom-up strategy, and the least absolute shrinkage and selection operator (LASSO), in terms of the sensitivity and specificity of CGCI. This is shown using simulations of linear and nonlinear, low- and high-dimensional systems and different time series lengths. For nonlinear systems, CGCI from the restricted VAR representations is compared with analogous nonlinear causality indices. Further, CGCI in conjunction with BTS and the other restricted VAR representations is applied to multi-channel scalp electroencephalogram (EEG) recordings of epileptic patients containing epileptiform discharges. CGCI on the restricted VAR, and BTS in particular, could track the changes in brain connectivity before, during, and after epileptiform discharges, which was not possible using the full VAR representation.
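The CGCI itself is simple to compute once the VAR regressions are fitted: it is the log ratio of the residual variance of the restricted model (driver's lags excluded) to that of the full model. A minimal sketch on a hypothetical bivariate system, not data from the paper:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy bivariate system in which X1 drives X2 with one lag (hypothetical).
n = 2000
x = np.zeros((n, 2))
for t in range(1, n):
    x[t, 0] = 0.5 * x[t - 1, 0] + rng.standard_normal()
    x[t, 1] = 0.5 * x[t - 1, 1] + 0.4 * x[t - 1, 0] + rng.standard_normal()

def resid_var(y, lagged):
    """Residual variance of an OLS regression of y on lagged predictors."""
    X = np.column_stack([np.ones(len(lagged)), lagged])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return np.var(y - X @ beta)

# CGCI(X1 -> X2): residual variance without X1's lags vs. with them.
cgci_12 = np.log(resid_var(x[1:, 1], x[:-1, [1]]) / resid_var(x[1:, 1], x[:-1, :]))
# CGCI(X2 -> X1): should be near zero, since X2 does not drive X1.
cgci_21 = np.log(resid_var(x[1:, 0], x[:-1, [0]]) / resid_var(x[1:, 0], x[:-1, :]))
```

BTS and the other restriction schemes discussed in the abstract decide which lagged variables enter the full regression when the number of candidate terms is large relative to the series length.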
Siggiridou, Elsa
2015-01-01
Granger causality has been used for the investigation of the inter-dependence structure of the underlying systems of multi-variate time series. In particular, the direct causal effects are commonly estimated by the conditional Granger causality index (CGCI). In the presence of many observed variables and relatively short time series, CGCI may fail because it is based on vector autoregressive models (VAR) involving a large number of coefficients to be estimated. In this work, the VAR is restricted by a scheme that modifies the recently developed method of backward-in-time selection (BTS) of the lagged variables and the CGCI is combined with BTS. Further, the proposed approach is compared favorably to other restricted VAR representations, such as the top-down strategy, the bottom-up strategy, and the least absolute shrinkage and selection operator (LASSO), in terms of sensitivity and specificity of CGCI. This is shown by using simulations of linear and nonlinear, low and high-dimensional systems and different t...
A Multivariate Fit Luminosity Function and World Model for Long GRBs
Shahmoradi, Amir
2012-01-01
It is proposed that the luminosity function, the comoving-frame spectral correlations and distributions of cosmological long-duration gamma-ray bursts (LGRBs) may be very well described as a multivariate log-normal distribution. This result is based on careful selection, analysis and modeling of the spectral parameters of LGRBs in the largest catalog of gamma-ray bursts available to date: 2130 BATSE GRBs, while taking into account the detection threshold and possible selection effects on observational data. Constraints on the joint quadru-variate distribution of the isotropic peak luminosity, the total isotropic emission, the comoving-frame time-integrated spectral peak energy and the comoving-frame duration of LGRBs are derived. Extensive goodness-of-fit tests are performed. The presented analysis provides evidence for a relatively large fraction of LGRBs that have been missed by the BATSE detector, with total isotropic emissions extending down to 10^49 erg and observed spectral peak energies as low as 5 keV. T...
Directory of Open Access Journals (Sweden)
Youxin Luo
2013-07-01
Full Text Available Grey system theory is a scientific theory with wide applicability to the study of systems with poor (incomplete) information. The construction method of the background value in the multivariable grey model was analyzed. A trapezoid formula and an extrapolation method using rational interpolation and numerical integration were proposed based on the theory of vector-valued continued fractions. A non-equidistant multivariable grey model MGRM(1,n) was then built by applying the reciprocal accumulated generating operation. The model is suitable for building both equidistant and non-equidistant models; it broadens the application range of the grey model and effectively increases both the fitting and the prediction precision of the model. The applicability and reliability of the model were demonstrated with real cases.
Cole-Cole, linear and multivariate modeling of capacitance data for on-line monitoring of biomass.
Dabros, Michal; Dennewald, Danielle; Currie, David J; Lee, Mark H; Todd, Robert W; Marison, Ian W; von Stockar, Urs
2009-02-01
This work evaluates three techniques of calibrating capacitance (dielectric) spectrometers used for on-line monitoring of biomass: modeling of cell properties using the theoretical Cole-Cole equation, linear regression of dual-frequency capacitance measurements on biomass concentration, and multivariate (PLS) modeling of scanning dielectric spectra. The performance and robustness of each technique is assessed during a sequence of validation batches in two experimental settings of differing signal noise. In more noisy conditions, the Cole-Cole model had significantly higher biomass concentration prediction errors than the linear and multivariate models. The PLS model was the most robust in handling signal noise. In less noisy conditions, the three models performed similarly. Estimates of the mean cell size were done additionally using the Cole-Cole and PLS models, the latter technique giving more satisfactory results.
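The second of the three calibration techniques, linear regression of a dual-frequency capacitance difference on biomass concentration, can be sketched in a few lines. The numbers below are synthetic and purely illustrative, not data from the study:

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic calibration set: the capacitance difference between a low and a
# high measurement frequency is roughly proportional to viable biomass.
biomass = np.linspace(1.0, 20.0, 30)                                 # g/L
delta_cap = 0.8 * biomass + 1.5 + rng.normal(0, 0.2, biomass.size)   # pF/cm

# Fit biomass as a linear function of the capacitance difference,
# so new capacitance readings can be converted to biomass on-line.
A = np.column_stack([delta_cap, np.ones_like(delta_cap)])
slope, intercept = np.linalg.lstsq(A, biomass, rcond=None)[0]

predicted = slope * delta_cap + intercept
rmse = float(np.sqrt(np.mean((predicted - biomass) ** 2)))
```

The Cole-Cole and PLS approaches mentioned in the abstract replace this two-point measurement with a model of the full dielectric spectrum.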
Multivariate calibration modeling of liver oxygen saturation using near-infrared spectroscopy
Cingo, Ndumiso A.; Soller, Babs R.; Puyana, Juan C.
2000-05-01
The liver has been identified as an ideal site to spectroscopically monitor for changes in oxygen saturation during liver transplantation and shock because it is susceptible to reduced blood flow and oxygen transport. Near-IR spectroscopy, combined with multivariate calibration techniques, has been shown to be a viable technique for monitoring oxygen saturation changes in various organs in a minimally invasive manner. The liver has a dual circulation system. Blood enters the liver through the portal vein and hepatic artery, and leaves through the hepatic vein. Therefore, it is of utmost importance to determine how the liver NIR spectroscopic information correlates with the different regions of the hepatic lobule as the dual circulation flows from the presinusoidal space into the post-sinusoidal region of the central vein. For NIR spectroscopic information to reliably represent the status of liver oxygenation, the NIR oxygen saturation should best correlate with the post-sinusoidal region. In a series of six pigs undergoing induced hemorrhagic shock, NIR spectra collected from the liver were used together with oxygen saturation reference data from the hepatic and portal veins, and an average of the two, to build partial least-squares regression models. Results obtained from these models show that the hepatic vein and the average of the hepatic and portal veins provide reference information that best correlates with the NIR spectral information, while the portal vein reference measurement provides poorer correlation and accuracy. These results indicate that NIR determination of oxygen saturation in the liver can provide an assessment of liver oxygen utilization.
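The core of the calibration step, a partial least-squares (PLS1) regression of a reference quantity on spectra, can be sketched with a minimal NIPALS implementation. The "spectra" below are synthetic stand-ins (two Gaussian bands whose balance tracks saturation), not the pig data:

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic "NIR spectra": two absorption bands whose relative strength
# tracks oxygen saturation (hypothetical data, for illustration only).
n, p = 40, 200
so2 = rng.uniform(20, 95, n)                       # reference saturation, %
grid = np.arange(p)
band1 = np.exp(-0.5 * ((grid - 60) / 15) ** 2)
band2 = np.exp(-0.5 * ((grid - 140) / 15) ** 2)
X = np.outer(so2, band1) + np.outer(100 - so2, band2) \
    + rng.normal(0, 1.0, (n, p))

def pls1_train_predict(X, y, n_comp):
    """Minimal PLS1 (NIPALS with X-deflation); returns fitted values."""
    Xc, yc = X - X.mean(0), y - y.mean()
    scores = []
    for _ in range(n_comp):
        w = Xc.T @ yc
        w /= np.linalg.norm(w)          # weight vector
        t = Xc @ w                      # score vector
        p_load = Xc.T @ t / (t @ t)     # loading vector
        Xc = Xc - np.outer(t, p_load)   # deflate X; scores stay orthogonal
        scores.append(t)
    T = np.column_stack(scores)
    b, *_ = np.linalg.lstsq(T, yc, rcond=None)
    return T @ b + y.mean()

pred = pls1_train_predict(X, so2, n_comp=2)
rmse = float(np.sqrt(np.mean((pred - so2) ** 2)))
```

In practice the number of components and the model's accuracy would be assessed on held-out spectra rather than on the training set as done here.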
Institute of Scientific and Technical Information of China (English)
Wengang Zhang; Anthony T.C. Goh
2016-01-01
Piles are long, slender structural elements used to transfer the loads from the superstructure through weak strata onto stiffer soils or rocks. For driven piles, the impact of the piling hammer induces compression and tension stresses in the piles. Hence, an important design consideration is to check that the strength of the pile is sufficient to resist the stresses caused by the impact of the pile hammer. Due to its complexity, pile drivability lacks a precise analytical solution with regard to the phenomena involved. In situations where measured data or numerical hypothetical results are available, neural networks stand out in mapping the nonlinear interactions and relationships between the system's predictors and dependent responses. In addition, unlike most computational tools, no mathematical relationship between the dependent and independent variables has to be assumed. Nevertheless, neural networks have been criticized for their long trial-and-error training process, since the optimal configuration is not known a priori. This paper investigates the use of a fairly simple nonparametric regression algorithm known as multivariate adaptive regression splines (MARS) as an alternative to neural networks, to approximate the relationship between the inputs and the dependent response, and to mathematically interpret the relationship between the various parameters. In this paper, back-propagation neural network (BPNN) and MARS models are developed for assessing pile drivability in relation to the prediction of the maximum compressive stress (MCS), maximum tensile stress (MTS), and blows per foot (BPF). A database of more than four thousand piles is utilized for model development and comparative performance between BPNN and MARS predictions.
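MARS models are built from hinge (truncated linear) basis functions, which is what makes the fitted relationship directly interpretable. A minimal sketch with a hypothetical one-dimensional response and a knot assumed known (real MARS searches over knots and interactions):

```python
import numpy as np

rng = np.random.default_rng(4)

def hinge(x, knot, sign):
    """MARS basis function: max(0, sign * (x - knot))."""
    return np.maximum(0.0, sign * (x - knot))

# Synthetic piecewise-linear response, the form MARS represents exactly.
x = np.linspace(0, 10, 200)
y = 2.0 * hinge(x, 4.0, +1) - 1.5 * hinge(x, 4.0, -1) \
    + rng.normal(0, 0.1, x.size)

# With the knot fixed, fitting reduces to ordinary least squares on the
# hinge-expanded design matrix; coefficients are the segment slopes.
X = np.column_stack([np.ones_like(x), hinge(x, 4.0, +1), hinge(x, 4.0, -1)])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
```

The coefficients recovered for the two hinge terms approximate the generating slopes (2.0 and -1.5), which is the kind of parameter-level interpretation the paper contrasts with the black-box BPNN.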
Multivariate Modelling of the Canary Islands Banana Output. The Role of Farmer Income Expectation
Directory of Open Access Journals (Sweden)
Concepción González-Concepción
2008-01-01
Full Text Available The EU is the world's largest importer of bananas and the only major managed market in the international banana trade. Spain is the main banana producer within the European Union (EU), followed by France and Portugal. In all these countries the fruit is grown in overseas islands situated in tropical or sub-tropical areas, and bananas are a pillar of the economic, social and environmental balance of these regions. Spanish production comes from the Canary Islands, an insular environment located in the Atlantic Ocean more than 1000 km south of the Iberian Peninsula and near the northwest coast of Africa. In the context of high production costs and strong competition from Latin American imports, the compensatory aid that local farmers have been receiving from the EU since 1993 has helped the archipelago to maintain its agricultural position while constituting a main support from an economic, social and landscaping standpoint. This research analyses the evolution of the Canary Islands banana output through the use of certain multivariate dynamic models that consider the influence of past production costs, past farmer income and future expectations, including a sensitivity analysis. We consider annual time series data on production, perceived prices and production costs for the period 1938-2002. Model predictions are contrasted using data for the period 2003-2006, thus spanning a wide period of time that includes key points such as the 1993 reform and the introduction of the 2006 reform. The empirical work highlights, as do all EU norms, the importance of maintaining adequate farmer income expectations to assure the subsistence of banana production.
Mehdizadeh, Saeid; Behmanesh, Javad; Khalili, Keivan
2017-07-01
Soil temperature (Ts) and its thermal regime are the most important factors in plant growth, biological activities, and water movement in soil. Due to scarcity of Ts data, estimation of soil temperature is an important issue in different fields of science. The main objective of the present study is to investigate the accuracy of multivariate adaptive regression splines (MARS) and support vector machine (SVM) methods for estimating Ts. For this aim, the monthly mean data of Ts (at depths of 5, 10, 50, and 100 cm) and meteorological parameters of 30 synoptic stations in Iran were utilized. To develop the MARS and SVM models, various combinations of minimum, maximum, and mean air temperatures (Tmin, Tmax, T); actual and maximum possible sunshine duration and sunshine duration ratio (n, N, n/N); actual, net, and extraterrestrial solar radiation data (Rs, Rn, Ra); precipitation (P); relative humidity (RH); wind speed at 2 m height (u2); and water vapor pressure (Vp) were used as input variables. Three error statistics, including root mean square error (RMSE), mean absolute error (MAE), and the determination coefficient (R2), were used to check the performance of the MARS and SVM models. The results indicated that MARS was superior to SVM at different depths. In the test and validation phases, the most accurate estimations for MARS were obtained at the depth of 10 cm for the Tmax, Tmin, T inputs (RMSE = 0.71 °C, MAE = 0.54 °C, and R2 = 0.995) and for the RH, Vp, P, and u2 inputs (RMSE = 0.80 °C, MAE = 0.61 °C, and R2 = 0.996), respectively.
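The three error statistics used to rank the MARS and SVM models are standard and easy to compute. A self-contained sketch with made-up observed/predicted values (not the study's data):

```python
import numpy as np

def error_stats(observed, predicted):
    """Return RMSE, MAE, and the coefficient of determination (R2)."""
    observed = np.asarray(observed, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    err = predicted - observed
    rmse = float(np.sqrt(np.mean(err ** 2)))
    mae = float(np.mean(np.abs(err)))
    ss_res = float(np.sum(err ** 2))
    ss_tot = float(np.sum((observed - observed.mean()) ** 2))
    return rmse, mae, 1.0 - ss_res / ss_tot

obs = [12.1, 15.3, 18.0, 21.4, 24.9]    # e.g., monthly mean Ts in degrees C
pred = [12.4, 15.0, 18.5, 21.1, 25.2]   # hypothetical model output
rmse, mae, r2 = error_stats(obs, pred)
```

RMSE penalizes large errors more than MAE, which is why the study reports both alongside R2.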
Forghani, Ali; Peralta, Richard C.
2017-10-01
The study presents a procedure using solute transport and statistical models to evaluate the performance of aquifer storage and recovery (ASR) systems designed to earn additional water rights in freshwater aquifers. The recovery effectiveness (REN) index quantifies the performance of these ASR systems. REN is the proportion of the injected water that the same ASR well can recapture during subsequent extraction periods. To estimate REN for individual ASR wells, the presented procedure uses finely discretized groundwater flow and contaminant transport modeling. Then, the procedure uses multivariate adaptive regression splines (MARS) analysis to identify the significant variables affecting REN and the most recovery-effective wells. The operator of the studied 14-well ASR system seeks REN values close to 100%. This recovery is feasible for most of the ASR wells by extracting three times the injectate volume during the same year as injection. Most of the wells would achieve RENs below 75% if extracting merely the same volume as they injected. In other words, recovering almost all of the injected water molecules requires having a pre-existing water right to extract groundwater annually. MARS shows that REN correlates most significantly with groundwater flow velocity, or hydraulic conductivity and hydraulic gradient. MARS results also demonstrate that maximizing REN requires utilizing the wells located in areas with background Darcian groundwater velocities less than 0.03 m/d. The study also highlights the superiority of MARS over regular multiple linear regression for identifying the wells that can provide the maximum REN. This is the first reported application of MARS for evaluating the performance of an ASR system in freshwater aquifers.
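The REN bookkeeping itself is simple once a transport model supplies the injectate fraction in each extraction period. A hedged sketch with hypothetical volumes and fractions (in the study these come from the flow/transport simulation, not from hand-entered numbers):

```python
def recovery_effectiveness(injected_volume, extracted_volumes,
                           injectate_fractions):
    """REN: fraction of the injected water the well recaptures across
    extraction periods, i.e. sum(extracted * injectate fraction) divided
    by the injected volume, capped at 1.0."""
    recovered = sum(v * f for v, f in
                    zip(extracted_volumes, injectate_fractions))
    return min(recovered / injected_volume, 1.0)

# A well that extracts three times its injectate volume, with a declining
# injectate fraction in the pumped water (hypothetical values):
ren = recovery_effectiveness(
    injected_volume=100.0,                   # e.g., thousand m^3
    extracted_volumes=[100.0, 100.0, 100.0],
    injectate_fractions=[0.55, 0.25, 0.12],
)
```

This illustrates the abstract's point: even pumping three times the injected volume, the recaptured injectate may still fall short of 100%.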
Bioprinting technologies for disease modeling
DEFF Research Database (Denmark)
Memic, Adnan; Navaei, Ali; Mirani, Bahram
2017-01-01
There is a great need for the development of biomimetic human tissue models that allow elucidation of the pathophysiological conditions involved in disease initiation and progression. Conventional two-dimensional (2D) in vitro assays and animal models have been unable to fully recapitulate the critical characteristics of human physiology. Alternatively, three-dimensional (3D) tissue models are often developed in a low-throughput manner and lack crucial native-like architecture. The recent emergence of bioprinting technologies has enabled creating 3D tissue models that address the critical challenges of conventional in vitro assays through the development of custom bioinks and patient-derived cells coupled with well-defined arrangements of biomaterials. Here, we provide an overview of the technological aspects of the 3D bioprinting technique and discuss how the development of bioprinted tissue...
Torabi, Mahmoud
2016-09-01
Disease mapping of a single disease has been widely studied in the public health setting. Simultaneous modeling of related diseases can also be a valuable tool, both from the epidemiological and from the statistical point of view. In particular, when we have several measurements recorded at each spatial location, we need to consider multivariate models in order to handle the dependence among the multivariate components as well as the spatial dependence between locations. It is then customary to use multivariate spatial models assuming the same distribution through the entire population density. However, in many circumstances, it is a very strong assumption to have the same distribution for all the areas of population density. To overcome this issue, we propose a hierarchical multivariate mixture generalized linear model to simultaneously analyze spatial Normal and non-Normal outcomes. As an application of our proposed approach, esophageal and lung cancer deaths in Minnesota are used to show the advantage of assuming different distributions for different counties of Minnesota rather than a single distribution for the population density. Performance of the proposed approach is also evaluated through a simulation study. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Rientjes, T.H.M.; Muthuwatta, L.P.; Bos, M.G.; Booij, M.J.; Bhatti, H.A.
2013-01-01
In this study, streamflow (Qs) and satellite-based actual evapotranspiration (ETa) are used in a multi-variable calibration framework to reproduce the catchment water balance. The application is for the HBV rainfall–runoff model at daily time-step for the Karkheh River Basin (51,000 km2) in Iran. Mo
K.I.E. Snell (Kym I.E.); H. Hua (Harry); T.P. Debray (Thomas P.A.); J. Ensor (Joie); M.P. Look (Maxime); K.G.M. Moons (Karel G.M.); R.D. Riley (Richard D.)
2016-01-01
textabstractObjectives Our aim was to improve meta-analysis methods for summarizing a prediction model's performance when individual participant data are available from multiple studies for external validation. Study Design and Setting We suggest multivariate meta-analysis for jointly synthesizing c
Directory of Open Access Journals (Sweden)
Michael Friendly
2006-11-01
Full Text Available This paper describes graphical methods for multiple-response data within the framework of the multivariate linear model (MLM), aimed at understanding what is being tested in a multivariate test and how factor/predictor effects are expressed across multiple response measures. In particular, we describe and illustrate a collection of SAS macro programs for: (a) data ellipses and low-rank biplots for multivariate data; (b) HE plots, showing the hypothesis and error covariance matrices for a given pair of responses and a given effect; (c) HE plot matrices, showing all pairwise HE plots; and (d) low-rank analogs of HE plots, showing all observations, group means, and their relations to the response variables.
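The data ellipse underlying these plots is the contour of constant Mahalanobis distance from the mean, scaled by a chi-square quantile for the desired coverage. A sketch of its construction in Python rather than SAS, with simulated data:

```python
import numpy as np

rng = np.random.default_rng(5)

# Simulated bivariate data with a known covariance structure.
x = rng.multivariate_normal([0.0, 0.0], [[2.0, 1.2], [1.2, 1.0]], size=5000)
mu = x.mean(axis=0)
S = np.cov(x, rowvar=False)

# Data ellipse: {mu + c * A u : |u| = 1}, where A A' = S and
# c^2 is the chi-square quantile setting the coverage level.
c = np.sqrt(5.991)                   # 95% coverage, 2 degrees of freedom
A = np.linalg.cholesky(S)
theta = np.linspace(0.0, 2.0 * np.pi, 361)
unit_circle = np.vstack([np.cos(theta), np.sin(theta)])
ellipse = mu + (A @ unit_circle).T * c    # (361, 2) boundary points
```

HE plots extend this idea by overlaying separate ellipses for the hypothesis and error covariance matrices of a multivariate test.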
Towards a Multi-Variable Parametric Cost Model for Ground and Space Telescopes
Stahl, H. Philip; Henrichs, Todd
2016-01-01
Parametric cost models can be used by designers and project managers to perform relative cost comparisons between major architectural cost drivers and allow high-level design trades; enable cost-benefit analysis for technology development investment; and provide a basis for estimating total project cost between related concepts. This paper hypothesizes a single model, based on published models and engineering intuition, for both ground and space telescopes: OTA cost ≈ X · D^(1.75 ± 0.05) · λ^(−0.5 ± 0.25) · T^(−0.25) · e^(−0.04 Y). Specific findings include: space telescopes cost 50X to 100X more than ground telescopes; diameter is the most important cost-estimating relationship (CER); cost is reduced by approximately 50% every 20 years (presumably because of technology advances and process improvements); and, for space telescopes, cost associated with wavelength performance is balanced by cost associated with operating temperature. Finally, duplication only reduces cost for the manufacture of identical systems (i.e., multiple-aperture sparse arrays or interferometers). And, while duplication does reduce the cost of manufacturing the mirrors of a segmented primary mirror, this cost savings does not appear to manifest itself in the final primary mirror assembly (presumably because the structure for a segmented mirror is more complicated than for a monolithic mirror).
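The hypothesized scaling law is easy to exercise numerically; here D is aperture diameter, λ wavelength, T temperature, and Y year of development, with X a multiplier the paper leaves architecture-dependent. A small sketch of two of its implications:

```python
import math

def ota_cost_ratio(d1, d2, exponent=1.75):
    """Relative OTA cost of two telescopes differing only in aperture D,
    under the hypothesized D^1.75 cost-estimating relationship."""
    return (d1 / d2) ** exponent

# Doubling the aperture roughly triples the OTA cost:
ratio = ota_cost_ratio(2.0, 1.0)

def technology_discount(years, rate=0.04):
    """Cost reduction factor e^(-rate * years); at rate 0.04 per year the
    cost halves roughly every 17-20 years, as the paper notes."""
    return math.exp(-rate * years)

halving = technology_discount(17.3)
```

The wavelength and temperature exponents enter the same way, which is how the model expresses the stated balance between wavelength performance and operating temperature for space telescopes.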
Revealing the Organization of Complex Adaptive Systems through Multivariate Time Series Modeling
Directory of Open Access Journals (Sweden)
David G. Angeler
2011-09-01
Full Text Available Revealing the adaptive responses of ecological, social, and economic systems to a transforming biosphere is crucial for understanding system resilience and preventing collapse. However, testing the theory that underpins complex adaptive system organization (e.g., panarchy theory is challenging. We used multivariate time series modeling to identify scale-specific system organization and, by extension, apparent resilience mechanisms. We used a 20-year time series of invertebrates and phytoplankton from 26 Swedish lakes to test the proposition that a few key-structuring environmental variables at specific scales create discontinuities in community dynamics. Cross-scale structure was manifested in two independent species groups within both communities across lakes. The first species group showed patterns of directional temporal change, which was related to environmental variables that acted at broad spatiotemporal scales (reduced sulfate deposition, North Atlantic Oscillation. The second species group showed fluctuation patterns, which often could not be explained by environmental variables. However, when significant relationships were found, species-group trends were predicted by variables (total organic carbon, nutrients that acted at narrower spatial scales (i.e., catchment and lake. Although the sets of environmental variables that predicted the species groups differed between phytoplankton and invertebrates, the scale-specific imprints of keystone environmental variables for creating cross-scale structure were clear for both communities. Temporal trends of functional groups did not track the observed structural changes, suggesting functional stability despite structural change. Our approach allows for identifying scale-specific patterns and processes, thus providing opportunities for better characterization of complex adaptive systems organization and dynamics. This, in turn, holds potential for more accurate evaluation of resilience in
Snell, Kym I E; Hua, Harry; Debray, Thomas P A; Ensor, Joie; Look, Maxime P; Moons, Karel G M; Riley, Richard D
2016-01-01
Our aim was to improve meta-analysis methods for summarizing a prediction model's performance when individual participant data are available from multiple studies for external validation. We suggest multivariate meta-analysis for jointly synthesizing calibration and discrimination performance, while accounting for their correlation. The approach estimates a prediction model's average performance, the heterogeneity in performance across populations, and the probability of "good" performance in new populations. This allows different implementation strategies (e.g., recalibration) to be compared. Application is made to a diagnostic model for deep vein thrombosis (DVT) and a prognostic model for breast cancer mortality. In both examples, multivariate meta-analysis reveals that calibration performance is excellent on average but highly heterogeneous across populations unless the model's intercept (baseline hazard) is recalibrated. For the cancer model, the probability of "good" performance (defined by C statistic ≥0.7 and calibration slope between 0.9 and 1.1) in a new population was 0.67 with recalibration but 0.22 without recalibration. For the DVT model, even with recalibration, there was only a 0.03 probability of "good" performance. Multivariate meta-analysis can be used to externally validate a prediction model's calibration and discrimination performance across multiple populations and to evaluate different implementation strategies. Crown Copyright © 2016. Published by Elsevier Inc. All rights reserved.
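The "probability of good performance in a new population" can be sketched by Monte Carlo sampling from the predictive distribution that a multivariate meta-analysis yields. The mean, covariance, and threshold values below are hypothetical, not the paper's estimates:

```python
import numpy as np

rng = np.random.default_rng(6)

# Hypothetical bivariate predictive distribution of a model's performance
# in a new population: [C statistic, calibration slope], with their
# between-study correlation captured in the covariance.
mean = np.array([0.72, 1.00])
cov = np.array([[0.0009, 0.0006],
                [0.0006, 0.0100]])

draws = rng.multivariate_normal(mean, cov, size=200_000)
good = (draws[:, 0] >= 0.7) & (draws[:, 1] >= 0.9) & (draws[:, 1] <= 1.1)
p_good = good.mean()   # P(C >= 0.7 and slope in [0.9, 1.1])
```

Comparing this probability with and without recalibration (which would shift the predictive distribution of the slope) is how the different implementation strategies in the abstract are contrasted.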
DEFF Research Database (Denmark)
Kallevik, H.; Hansen, Susanne Brunsgaard; Sæther, Ø.
2000-01-01
Water-in-oil emulsions are investigated by means of multivariate analysis of near infrared (NIR) spectroscopic profiles in the range 1100 - 2250 nm. The oil phase is a paraffin-diluted crude oil from the Norwegian Continental Shelf. The influence of water absorption and light scattering of the wa...
Multiple-model-and-neural-network-based nonlinear multivariable adaptive control
Institute of Scientific and Technical Information of China (English)
Yue FU; Tianyou CHAI
2007-01-01
A multivariable adaptive controller feasible for implementation on distributed computer systems (DCS) is presented for a class of uncertain nonlinear multivariable discrete-time systems. The adaptive controller is composed of a linear adaptive controller, a neural network nonlinear adaptive controller, and a switching mechanism. The linear controller provides boundedness of the input and output signals, and the nonlinear controller improves the performance of the system. The purpose of the switching mechanism is to obtain improved system performance and stability simultaneously. Theoretical analysis and simulation results are presented to show the effectiveness of the proposed method.
Directory of Open Access Journals (Sweden)
Fang-Rong Yan
Full Text Available This article provides a fully Bayesian approach for modeling of single-dose and complete pharmacokinetic data in a population pharmacokinetic (PK) model. To overcome the impact of outliers and the difficulty of computation, a generalized linear model is chosen with the hypothesis that the errors follow a multivariate Student t distribution, which is a heavy-tailed distribution. The aim of this study is to investigate and implement the performance of the multivariate t distribution to analyze population pharmacokinetic data. Bayesian predictive inferences and the Metropolis-Hastings algorithm schemes are used to process the intractable posterior integration. The precision and accuracy of the proposed model are illustrated with simulated data and a real example of theophylline data.
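The multivariate Student t error distribution favored here for robustness is conveniently generated as a normal/chi-square scale mixture, which is also how it is typically handled inside MCMC samplers. A minimal sketch with hypothetical parameters:

```python
import numpy as np

rng = np.random.default_rng(7)

def multivariate_t(mean, scale, df, size, rng):
    """Sample a multivariate Student t as a scale mixture of normals:
    X = mean + Z / sqrt(W / df), Z ~ N(0, scale), W ~ chi-square(df)."""
    mean = np.asarray(mean, dtype=float)
    z = rng.multivariate_normal(np.zeros(mean.size), scale, size=size)
    w = rng.chisquare(df, size=size)
    return mean + z / np.sqrt(w / df)[:, None]

scale = np.array([[1.0, 0.3], [0.3, 1.0]])
x = multivariate_t([0.0, 0.0], scale, df=6, size=100_000, rng=rng)

# Heavier tails than a normal with the same scale matrix:
# Var(X) = df / (df - 2) * scale for df > 2 (here 1.5x the scale).
emp_var = x.var(axis=0)
```

The heavy tails are what give the model its resistance to outlying concentration measurements relative to a normal-error PK model.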
Bilgel, Murat; Prince, Jerry L; Wong, Dean F; Resnick, Susan M; Jedynak, Bruno M
2016-07-01
It is important to characterize the temporal trajectories of disease-related biomarkers in order to monitor progression and identify potential points of intervention. These are especially important for neurodegenerative diseases, as therapeutic intervention is most likely to be effective in the preclinical disease stages prior to significant neuronal damage. Neuroimaging allows for the measurement of structural, functional, and metabolic integrity of the brain at the level of voxels, whose volumes are on the order of mm(3). These voxelwise measurements provide a rich collection of disease indicators. Longitudinal neuroimaging studies enable the analysis of changes in these voxelwise measures. However, commonly used longitudinal analysis approaches, such as linear mixed effects models, do not account for the fact that individuals enter a study at various disease stages and progress at different rates, and generally consider each voxelwise measure independently. We propose a multivariate nonlinear mixed effects model for estimating the trajectories of voxelwise neuroimaging biomarkers from longitudinal data that accounts for such differences across individuals. The method involves the prediction of a progression score for each visit based on a collective analysis of voxelwise biomarker data within an expectation-maximization framework that efficiently handles large amounts of measurements and variable number of visits per individual, and accounts for spatial correlations among voxels. This score allows individuals with similar progressions to be aligned and analyzed together, which enables the construction of a trajectory of brain changes as a function of an underlying progression or disease stage. We apply our method to studying cortical β-amyloid deposition, a hallmark of preclinical Alzheimer's disease, as measured using positron emission tomography. Results on 104 individuals with a total of 300 visits suggest that precuneus is the earliest cortical region to
A MULTIVARIATE FIT LUMINOSITY FUNCTION AND WORLD MODEL FOR LONG GAMMA-RAY BURSTS
Energy Technology Data Exchange (ETDEWEB)
Shahmoradi, Amir, E-mail: amir@physics.utexas.edu [Institute for Fusion Studies, The University of Texas at Austin, TX 78712 (United States)
2013-04-01
It is proposed that the luminosity function, the rest-frame spectral correlations, and distributions of cosmological long-duration (Type-II) gamma-ray bursts (LGRBs) may be very well described as a multivariate log-normal distribution. This result is based on careful selection, analysis, and modeling of LGRBs' temporal and spectral variables in the largest catalog of GRBs available to date: 2130 BATSE GRBs, while taking into account the detection threshold and possible selection effects. Constraints on the joint rest-frame distribution of the isotropic peak luminosity (L_iso), total isotropic emission (E_iso), the time-integrated spectral peak energy (E_p,z), and duration (T_90,z) of LGRBs are derived. The presented analysis provides evidence for a relatively large fraction of LGRBs that have been missed by the BATSE detector, with E_iso extending down to ~10^49 erg and observed spectral peak energies (E_p) as low as ~5 keV. LGRBs with rest-frame duration T_90,z ≲ 1 s or observer-frame duration T_90 ≲ 2 s appear to be rare events (≲ 0.1% chance of occurrence). The model predicts a fairly strong and highly significant correlation (ρ = 0.58 ± 0.04) between E_iso and E_p,z of LGRBs. Also predicted are strong correlations of L_iso and E_iso with T_90,z and a moderate correlation between L_iso and E_p,z. The strength and significance of the correlations found encourage the search for underlying mechanisms, though they undermine the correlations' capabilities as probes of dark energy's equation of state at high redshifts. The presented analysis favors, but does not necessitate, a cosmic rate for BATSE LGRBs tracing metallicity evolution consistent with a cutoff Z/Z_Sun ~ 0.2-0.5, assuming no luminosity-redshift evolution.
Ringham, Brandy M; Kreidler, Sarah M; Muller, Keith E; Glueck, Deborah H
2016-07-30
Multilevel and longitudinal studies are frequently subject to missing data. For example, biomarker studies for oral cancer may involve multiple assays for each participant. Assays may fail, resulting in missing data values that can be assumed to be missing completely at random. Catellier and Muller proposed a data analytic technique to account for data missing at random in multilevel and longitudinal studies. They suggested modifying the degrees of freedom for both the Hotelling-Lawley trace F statistic and its null case reference distribution. We propose parallel adjustments to approximate power for this multivariate test in studies with missing data. The power approximations use a modified non-central F statistic, which is a function of (i) the expected number of complete cases, (ii) the expected number of non-missing pairs of responses, or (iii) the trimmed sample size, which is the planned sample size reduced by the anticipated proportion of missing data. The accuracy of the method is assessed by comparing the theoretical results to the Monte Carlo simulated power for the Catellier and Muller multivariate test. Over all experimental conditions, the closest approximation to the empirical power of the Catellier and Muller multivariate test is obtained by adjusting power calculations with the expected number of complete cases. The utility of the method is demonstrated with a multivariate power analysis for a hypothetical oral cancer biomarkers study. We describe how to implement the method using standard, commercially available software products and give example code. Copyright © 2015 John Wiley & Sons, Ltd.
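The trimmed-sample-size idea described above can be sketched numerically: reduce the planned N by the anticipated missing-data fraction, then evaluate power under a noncentral F distribution. The effect parametrization and degrees-of-freedom formulas below are illustrative assumptions, not the exact Catellier and Muller adjustments from the paper.

```python
from scipy import stats

def approx_power_trimmed(n_planned, miss_rate, ndf, effect, alpha=0.05):
    """Approximate power of an F test using a 'trimmed' sample size: the
    planned N reduced by the anticipated proportion of missing data. The
    noncentrality and df formulas here are illustrative placeholders."""
    n_trim = n_planned * (1.0 - miss_rate)        # expected complete cases
    ddf = n_trim - ndf - 1                        # denominator df after trimming
    ncp = n_trim * effect                         # noncentrality grows with N
    f_crit = stats.f.ppf(1.0 - alpha, ndf, ddf)   # critical value under the null
    return stats.ncf.sf(f_crit, ndf, ddf, ncp)    # P(reject H0 | noncentrality)
```

Power computed this way shrinks as the anticipated missing-data rate grows, mirroring how the expected number of complete cases enters the approximation.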
Wheat grain attributes that influence tortilla quality are not fully understood. This impedes genetic improvement efforts to develop wheat varieties for the growing market. This study used a multivariate discriminant analysis to predict tortilla quality using a set of 16 variables derived from kerne...
Wu, Zhisheng; Sui, Chenglin; Xu, Bing; Ai, Lu; Ma, Qun; Shi, Xinyuan; Qiao, Yanjiang
2013-04-15
A methodology is proposed to estimate the multivariate detection limits (MDL) of an on-line near-infrared (NIR) model in a Chinese Herbal Medicines (CHM) system. In this paper, Lonicera japonica was used as an example, and its extraction process was monitored by on-line NIR spectroscopy. On-line NIR spectra were collected by two fiber-optic probes designed to transmit NIR radiation through a 2 mm flange. High-performance liquid chromatography (HPLC) was used as a reference method to determine the content of chlorogenic acid in the extract solution. Multivariate calibration models were developed, including partial least squares (PLS) regression and interval partial least squares (iPLS). The results showed improved model performance: compared with the PLS model, the root mean square error of prediction (RMSEP) of the iPLS model decreased from 0.111 mg to 0.068 mg, and R² increased from 0.9434 to 0.9801. Furthermore, MDL values were determined by a multivariate method using the type of errors and concentration ranges. The MDL of the iPLS model was about 14 ppm, which confirmed that on-line NIR spectroscopy is able to detect trace amounts of chlorogenic acid in L. japonica. As a result, the application of on-line NIR spectroscopy for monitoring the extraction process in CHM appears very encouraging and reliable.
Institute of Scientific and Technical Information of China (English)
Katsuaki Koike
2011-01-01
Sample data in the Earth and environmental sciences are limited in quantity and sampling location; therefore, sophisticated spatial modeling techniques are indispensable for accurate imaging of complicated structures and properties of geomaterials. This paper presents several effective methods that are grouped into two categories depending on the nature of the regionalized data used. Type I data originate from plural populations, and type II data satisfy the prerequisite of stationarity and have distinct spatial correlations. For the type I data, three methods are shown to be effective and demonstrated to produce plausible results: (1) a spline-based method, (2) a combination of a spline-based method with a stochastic simulation, and (3) a neural network method. Geostatistics proves to be a powerful tool for type II data. Three new approaches of geostatistics are presented with case studies: an application to directional data such as fractures, multi-scale modeling that incorporates a scaling law, and space-time joint analysis for multivariate data. Methods for improving the contribution of such spatial modeling to Earth and environmental sciences are also discussed, and future important problems to be solved are summarized.
Real, Jordi; Forné, Carles; Roso-Llorach, Albert; Martínez-Sánchez, Jose M
2016-05-01
Controlling for confounders is a crucial step in analytical observational studies, and multivariable models are widely used as statistical adjustment techniques. However, the validation of the assumptions of the multivariable regression models (MRMs) should be made clear in scientific reporting. The objective of this study is to review the quality of statistical reporting of the most commonly used MRMs (logistic, linear, and Cox regression) that were applied in analytical observational studies published between 2003 and 2014 by journals indexed in MEDLINE. We reviewed a representative sample of articles indexed in MEDLINE (n = 428) with an observational design and use of MRMs (logistic, linear, and Cox regression). We assessed the quality of reporting of: model assumptions and goodness-of-fit, interactions, sensitivity analysis, crude and adjusted effect estimates, and specification of more than one adjusted model. The tests of underlying assumptions or goodness-of-fit of the MRMs used were described in 26.2% (95% CI: 22.0-30.3) of the articles, and 18.5% (95% CI: 14.8-22.1) reported the interaction analysis. Reporting of all items assessed was higher in articles published in journals with a higher impact factor. A low percentage of articles indexed in MEDLINE that used multivariable techniques provided information demonstrating rigorous application of the model selected as an adjustment method. Given the importance of these methods to the final results and conclusions of observational studies, greater rigor is required in reporting the use of MRMs in the scientific literature.
Wheelock, Åsa M; Wheelock, Craig E
2013-11-01
Respiratory diseases are multifactorial heterogeneous diseases that have proved recalcitrant to understanding using focused molecular techniques. This trend has led to the rise of 'omics approaches (e.g., transcriptomics, proteomics) and the subsequent acquisition of large-scale datasets consisting of multiple variables. In 'omics technology-based investigations, the discrepancy between the number of variables analyzed (e.g., mRNA, proteins, metabolites) and the number of study subjects constitutes a major statistical challenge. The application of traditional univariate statistical methods (e.g., the t-test) to these "short-and-wide" datasets may result in high numbers of false positives, while the predominant approach of p-value correction to account for these high false positive rates (e.g., FDR, Bonferroni) is associated with significant losses in statistical power. In other words, the benefit of decreased false positives must be counterbalanced by a concomitant loss in true positives. As an alternative, multivariate statistical analysis (MVA) is increasingly being employed to cope with 'omics-based data structures. When properly applied, MVA approaches can be powerful tools for the integration and interpretation of complex 'omics-based datasets towards the goal of identifying biomarkers and/or subphenotypes. However, MVA methods are also prone to over-interpretation and misuse. A software package commonly used in biomedical research to perform MVA-based analyses is SIMCA, which includes multiple MVA methods. In this opinion piece, we propose guidelines for minimum reporting standards for a SIMCA-based workflow, in terms of data preprocessing (e.g., normalization, scaling) and model statistics (number of components, R2, Q2, and CV-ANOVA p-value). Examples of these applications in recent COPD and asthma studies are provided. It is expected that readers will gain an increased understanding of the power and utility of MVA methods for applications in biomedical research.
Quality-by-Design: Multivariate Model for Multicomponent Quantification in Refining Process of Honey
Li, Xiaoying; Wu, Zhisheng; Feng, Xin; Liu, Shanshan; Yu, Xiaojie; Ma, Qun; Qiao, Yanjiang
2017-01-01
Objective: A method for rapid analysis of the refining process of honey was developed based on near-infrared (NIR) spectroscopy. Methods: Partial least squares calibration models were built for the four components after selection of the optimal spectral pretreatment method and latent factors. Results: The models covered samples of different temperatures and time points; therefore, the models were robust and universal. Conclusions: These results highlighted that NIR technology can extract information on the critical process and provide essential process knowledge of the honey refining process. Abbreviations used: NIR: near-infrared; 5-HMF: 5-hydroxymethylfurfural; RMSEP: root mean square error of prediction; R: correlation coefficient; PRESS: prediction residual error sum of squares; TCM: traditional Chinese medicine; HPLC: high-performance liquid chromatography; HPLC-DAD: HPLC-diode array detector; PLS: partial least squares; MSC: multiplicative scatter correction; RMSECV: root mean square error of cross validation; RPD: residual predictive deviation; 1D: 1st-order derivative; SG: Savitzky-Golay smoothing; 2D: 2nd-order derivative. PMID: 28216906
Zouheir Mighri; Faysal Mansouri
2014-01-01
The aim of this article is to examine how the dynamics of correlations between two emerging countries (Brazil and Mexico) and the US evolved from January 2003 to December 2013. The main contribution of this study is to explore whether the plunging stock market in the US, in the aftermath of the global financial crisis (2007-2009), exerted contagion effects on emerging stock markets. To this end, we rely on a multivariate fractionally integrated asymmetric power autoregressive conditional heteros...
DEFF Research Database (Denmark)
Kallevik, H.; Hansen, Susanne Brunsgaard; Sæther, Ø.
2000-01-01
Water-in-oil emulsions are investigated by means of multivariate analysis of near infrared (NIR) spectroscopic profiles in the range 1100 - 2250 nm. The oil phase is a paraffin-diluted crude oil from the Norwegian Continental Shelf. The influence of water absorption and light scattering...... of the water droplets are shown to be strong. Despite the strong influence of the water phase, the NIR technique is still capable of predicting the composition of the investigated oil phase....
Ba, Demba; Temereanca, Simona; Brown, Emery N
2014-01-01
Understanding how ensembles of neurons represent and transmit information in the patterns of their joint spiking activity is a fundamental question in computational neuroscience. At present, analyses of spiking activity from neuronal ensembles are limited because multivariate point process (MPP) models cannot represent simultaneous occurrences of spike events at an arbitrarily small time resolution. Solo recently reported a simultaneous-event multivariate point process (SEMPP) model to correct this key limitation. In this paper, we show how Solo's discrete-time formulation of the SEMPP model can be efficiently fit to ensemble neural spiking activity using a multinomial generalized linear model (mGLM). Unlike existing approximate procedures for fitting the discrete-time SEMPP model, the mGLM is an exact algorithm. The MPP time-rescaling theorem can be used to assess model goodness-of-fit. We also derive a new marked point-process (MkPP) representation of the SEMPP model that leads to new thinning and time-rescaling algorithms for simulating an SEMPP stochastic process. These algorithms are much simpler than multivariate extensions of algorithms for simulating a univariate point process, and could not be arrived at without the MkPP representation. We illustrate the versatility of the SEMPP model by analyzing neural spiking activity from pairs of simultaneously-recorded rat thalamic neurons stimulated by periodic whisker deflections, and by simulating SEMPP data. In the data analysis example, the SEMPP model demonstrates that whisker motion significantly modulates simultaneous spiking activity at the 1 ms time scale and that the stimulus effect is more than one order of magnitude greater for simultaneous activity compared with non-simultaneous activity. Together, the mGLM, the MPP time-rescaling theorem and the MkPP representation of the SEMPP model offer a theoretically sound, practical tool for measuring joint spiking propensity in a neuronal ensemble.
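A toy version of fitting the discrete-time model above with a multinomial GLM: time is cut into 1 ms bins and each bin is labeled by which of two neurons spiked (neither, one, the other, or both), so that simultaneous events get their own outcome category, as in the SEMPP formulation. The data and effect sizes below are simulated assumptions, not the rat thalamus recordings.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n_bins = 20000                                        # 1 ms bins
stim = np.sin(2 * np.pi * np.arange(n_bins) / 125.0)  # periodic "whisker" drive

# Outcome per bin: 0 = no spike, 1 = neuron A only, 2 = neuron B only,
# 3 = simultaneous spikes. The stimulus raises spiking probability and,
# disproportionately, the probability of simultaneous events.
base = np.array([0.96, 0.015, 0.015, 0.01])
logits = np.log(base) + np.outer(stim, [0.0, 0.5, 0.5, 1.5])
probs = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)
cats = np.array([rng.choice(4, p=p) for p in probs])

# Multinomial GLM fit: one linear predictor per outcome category.
glm = LogisticRegression(max_iter=1000).fit(stim[:, None], cats)
# glm.coef_[3, 0] (simultaneous events) should carry the largest stimulus effect.
```

The recovered stimulus coefficient for the "both neurons" category exceeding the single-neuron ones mirrors the paper's finding that the stimulus effect is larger for simultaneous than for non-simultaneous activity.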
Guiding healthcare technology implementation: a new integrated technology implementation model.
Schoville, Rhonda R; Titler, Marita G
2015-03-01
Healthcare technology is used to improve delivery of safe patient care by providing tools for early diagnosis, ongoing monitoring, and treatment of patients. This technology includes bedside physiologic monitors, pulse oximetry devices, electrocardiogram machines, bedside telemetry, infusion pumps, ventilators, and electronic health records. Healthcare costs are a challenge for society, and hospitals are pushed to lower costs by discharging patients sooner. Healthcare technology is being used to facilitate these early discharges. There is little understanding of how healthcare facilities purchase, implement, and adopt technology. There are two areas of theories and models currently used when investigating technology: technology adoption and implementation science. Technology adoption focuses mainly on how the end users adopt technology, whereas implementation science describes methods, interventions, and variables that promote the use of evidence-based practice. These two approaches are not well informed by each other. In addition, amplifying the knowledge gap is the limited conceptualization of healthcare technology implementation frameworks. To bridge this gap, an all-encompassing model is needed. To understand the key technology implementation factors utilized by leading healthcare facilities, the prevailing technology adoption and implementation science theories and models were reviewed. From this review, an integrated technology implementation model will be set forth.
Kramer, Diether; Veeranki, Sai; Hayn, Dieter; Quehenberger, Franz; Leodolter, Werner; Jagsch, Christian; Schreier, Günter
2017-01-01
Delirium is an acute confusional state that is common in the elderly and often misdiagnosed in hospitalized patients. Early identification and prevention of delirium could reduce morbidity and mortality rates in those affected and reduce hospitalization costs. We have developed and validated a multivariate prediction model that predicts delirium and gives an early warning to physicians. A large set of patient electronic medical records was used in developing the models. Classical machine learning algorithms were used to develop the models, and their results were compared. Excellent results were obtained with the feature set and parameter settings, attaining an accuracy of 84%.
Aires, Filipe; Rossow, William B.; Hansen, James E. (Technical Monitor)
2001-01-01
A new approach is presented for the analysis of feedback processes in a nonlinear dynamical system by observing its variations. The new methodology consists of statistical estimates of the sensitivities between all pairs of variables in the system, based on a neural network model of the dynamical system. The model can then be used to estimate the instantaneous, multivariate and nonlinear sensitivities, which are shown to be essential for the analysis of the feedback processes involved in the dynamical system. The method is described and tested on synthetic data from the low-order Lorenz circulation model, where the correct sensitivities can be evaluated analytically.
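As a concrete anchor for the sensitivities discussed above, the sketch below writes down the instantaneous pairwise sensitivities of the Lorenz-63 system (the analytic Jacobian that a neural-network surrogate would have to reproduce) and cross-checks them with central finite differences; the statistical neural-network estimation itself is not reproduced here.

```python
import numpy as np

def lorenz_rhs(s, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Right-hand side of the Lorenz-63 circulation model."""
    x, y, z = s
    return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

def lorenz_jacobian(s, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Instantaneous sensitivities d(ds_i/dt)/d(s_j) at state s: the exact
    pairwise sensitivities against which a statistical estimate can be judged."""
    x, y, z = s
    return np.array([[-sigma, sigma, 0.0],
                     [rho - z, -1.0, -x],
                     [y, x, -beta]])

# Cross-check the analytic Jacobian with central finite differences.
s0 = np.array([1.0, 2.0, 3.0])
eps = 1e-6
fd = np.column_stack([
    (lorenz_rhs(s0 + eps * e) - lorenz_rhs(s0 - eps * e)) / (2 * eps)
    for e in np.eye(3)])
```

Because the Lorenz right-hand side is quadratic, central differences match the analytic Jacobian essentially to rounding error, which is what makes this model a convenient benchmark for sensitivity-estimation methods.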
DEFF Research Database (Denmark)
Skjærbæk, P. S.; Nielsen, Søren R. K.; Kirkegaard, Poul Henning
1997-01-01
The scope of the paper is to apply multi-variate time-domain models for identification of eigenfrequencies and mode shapes of a time-invariant model test Reinforced Concrete (RC) frame from measured decays. The frequencies and mode shapes of interest are the two lowest ones since they are normally...... the only ones activated in ground motion shaking of structures. For purely frequency identification, FFT, ARV, ERA and ARMAV models are applied and for mode shape identification, multi-variate ARV and ARMAV models and the ERA are used. Furthermore the results of a finite element analysis are included...... the second mode. It is found that the estimates of the frequency, damping ratio and mode shape for the first mode estimated by the multi-variate ARV, ARMAV and the ERA give nearly identical results for both types of excitation. Also the estimates of the frequency, damping ratio and mode shape of the second...
Widodo, Edy; Kariyam
2017-03-01
Response Surface Methodology (RSM) is used to determine the input variable settings that achieve the optimal compromise in the response variables. There are three primary steps in an RSM problem, namely data collection, modelling, and optimization. This study focuses on the establishment of response surface models, under the assumption that the collected data are correct. Usually the response surface model parameters are estimated by OLS. However, this method is highly sensitive to outliers. Outliers can generate substantial residuals and often distort the estimated models. The resulting estimates can be biased and may lead to errors in determining the optimal point, so that the main purpose of RSM is not achieved. Meanwhile, in practice, the collected data often contain several response variables and a set of independent variables. Treating each response separately and applying single-response procedures can result in misinterpretation, so a model for the multi-response case is needed. Therefore, a multivariate response surface model that is resistant to outliers is required. As an alternative, this study discusses M-estimation for estimating the parameters of multivariate response surface models containing outliers. As an illustration, a case study is presented on experimental results for the enhancement of the surface layer of an aluminium alloy by shot peening.
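A minimal sketch of M-estimation for a linear (response-surface-style) model: iteratively reweighted least squares with Huber weights, which downweights the outliers that would otherwise distort an OLS fit. This is a generic Huber IRLS on a toy univariate problem, not the specific multivariate estimator of the study.

```python
import numpy as np

def huber_m_fit(X, y, k=1.345, n_iter=50):
    """M-estimation of linear-model coefficients by iteratively reweighted
    least squares (IRLS) with Huber weights: observations with large scaled
    residuals get weight k*s/|r| < 1 instead of distorting the fit."""
    beta = np.linalg.lstsq(X, y, rcond=None)[0]           # OLS starting values
    for _ in range(n_iter):
        r = y - X @ beta
        s = np.median(np.abs(r - np.median(r))) / 0.6745  # robust scale (MAD)
        s = s if s > 0 else 1.0
        w = np.minimum(1.0, k * s / np.maximum(np.abs(r), 1e-12))
        sw = np.sqrt(w)
        beta = np.linalg.lstsq(X * sw[:, None], y * sw, rcond=None)[0]
    return beta

# Demo: a straight line contaminated with five gross outliers.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 50)
X = np.column_stack([np.ones_like(x), x])
y = 2.0 + 3.0 * x + 0.05 * rng.standard_normal(50)
y[:5] += 10.0                                    # contamination
beta_ols = np.linalg.lstsq(X, y, rcond=None)[0]  # pulled toward the outliers
beta_rob = huber_m_fit(X, y)                     # stays near (2, 3)
```

The same reweighting idea extends to multi-response fits by applying robust weights per observation across all responses, which is the direction the study takes.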
Energy Technology Data Exchange (ETDEWEB)
Tanner, Scott D. [Institute of Biomaterials and Biomedical Engineering, University of Toronto, Room 407, 164 College Street, Toronto, Ontario, M5S 3G9 (Canada)], E-mail: sd.tanner@utoronto.ca; Ornatsky, Olga; Bandura, Dmitry R.; Baranov, Vladimir I. [Institute of Biomaterials and Biomedical Engineering, University of Toronto, Room 407, 164 College Street, Toronto, Ontario, M5S 3G9 (Canada)
2007-03-15
Recent progress in the development of massively multiplexed bioanalytical assays using element tags with inductively coupled plasma mass spectrometry detection is reviewed. Feasibility results using commercially available secondary immunolabeling reagents for leukemic cell lines are presented. Multiplex analysis of higher order is shown with first generation tag reagents based on functionalized carriers that bind lanthanide ions. DNA quantification using metallointercalation allows for cell enumeration or mitotic state differentiation. In situ hybridization permits the determination of cellular RNA. The results provide a feasibility basis for the development of a multivariate assay tool for individual cell analysis based on inductively coupled plasma mass spectrometry in a cytometer configuration.
Cross-correlations and joint Gaussianity in multivariate level crossing models.
Di Bernardino, Elena; León, José; Tchumatchenko, Tatjana
2014-04-17
A variety of phenomena in physical and biological sciences can be mathematically understood by considering the statistical properties of level crossings of random Gaussian processes. Notably, a growing number of these phenomena demand a consideration of correlated level crossings emerging from multiple correlated processes. While many theoretical results have been obtained in the last decades for individual Gaussian level-crossing processes, few results are available for multivariate, jointly correlated threshold crossings. Here, we address bivariate upward crossing processes and derive the corresponding bivariate Central Limit Theorem as well as provide closed-form expressions for their joint level-crossing correlations.
Fuchs, Julia; Cermak, Jan; Andersen, Hendrik
2017-04-01
This study aims at untangling the impacts of external dynamics and local conditions on cloud properties in the Southeast Atlantic (SEA) by combining satellite and reanalysis data using multivariate statistics. The understanding of clouds and their determinants at different scales is important for constraining the Earth's radiative budget, and thus prominent in climate-system research. In this study, SEA stratocumulus cloud properties are observed not only as the result of local environmental conditions but also as affected by external dynamics and spatial origins of air masses entering the study area. In order to assess to what extent cloud properties are impacted by aerosol concentration, air mass history, and meteorology, a multivariate approach is conducted using satellite observations of aerosol and cloud properties (MODIS, SEVIRI), information on aerosol species composition (MACC) and meteorological context (ERA-Interim reanalysis). To account for the often-neglected but important role of air mass origin, information on air mass history based on HYSPLIT modeling is included in the statistical model. This multivariate approach is intended to lead to a better understanding of the physical processes behind observed stratocumulus cloud properties in the SEA.
Directory of Open Access Journals (Sweden)
Qi-an Chen
2015-01-01
Taking full advantage of the strengths of the G-H distribution, Copula functions, and GARCH models in depicting the return distribution of financial assets, we construct a multivariate time-varying G-H Copula GARCH model that can comprehensively describe the "asymmetric, leptokurtic, and heavy-tail" characteristics, the time-varying volatility, and the extreme-tail dependence of financial asset returns. Based on the conditional maximum likelihood estimator and the IFM method, we propose an estimation algorithm for the model parameters. Using the quantile function and a simulation method, we propose a VaR calculation algorithm based on this model. To apply this model to real financial market risk, we select the SSCI (China), HSI (Hong Kong, China), TAIEX (Taiwan, China), and S&P 500 (USA) from January 3, 2000, to June 18, 2010, as samples to estimate the model parameters and to measure the VaRs of various index risk portfolios under different confidence levels empirically. The results of the application example are in line with the actual situation and with the risk diversification theory of portfolios. To a certain extent, these results also justify the feasibility and effectiveness of the multivariate time-varying G-H Copula GARCH model in depicting the return distribution of financial assets.
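The quantile-based VaR step described above can be sketched independently of the copula-GARCH machinery: once return paths are simulated, VaR is simply a loss quantile. The heavy-tailed Student-t sample below is only a stand-in for returns simulated from the G-H Copula GARCH model.

```python
import numpy as np

def var_from_simulation(sim_returns, level=0.99):
    """Value-at-Risk as the loss quantile of simulated portfolio returns:
    the loss exceeded with probability 1 - level."""
    return -np.quantile(sim_returns, 1.0 - level)

# Heavy-tailed stand-in for copula-GARCH simulated daily returns.
rng = np.random.default_rng(7)
returns = 0.01 * rng.standard_t(df=4, size=100_000)
var99 = var_from_simulation(returns, 0.99)   # 99% VaR
var95 = var_from_simulation(returns, 0.95)   # 95% VaR
```

Raising the confidence level pushes the estimate further into the heavy tail, which is why tail-sensitive return models like the G-H distribution matter for VaR.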
Directory of Open Access Journals (Sweden)
Jui-Yang Chang
2012-11-01
A multivariate autoregressive model with exogenous inputs is developed for describing the cortical interactions excited by direct electrical current stimulation of the cortex. Current stimulation is challenging to model because it excites neurons in multiple locations both near and distant to the stimulation site. The approach presented here models these effects using an exogenous input that is passed through a bank of filters, one for each channel. The filtered input and a random input excite a multivariate autoregressive system describing the interactions between cortical activity at the recording sites. The exogenous input filter coefficients, the autoregressive coefficients, and the random input characteristics are estimated from the measured activity due to current stimulation. The effectiveness of the approach is demonstrated using intracranial recordings from three surgical epilepsy patients. We evaluate models for wakefulness and NREM sleep in these patients, with two stimulation levels in one patient and two stimulation sites in another, resulting in a total of ten datasets. Excellent agreement between measured and model-predicted evoked responses is obtained across all datasets. Furthermore, one-step prediction is used to show that the model also describes dynamics in prestimulus and evoked recordings. We also compare integrated information (a measure of intracortical communication thought to reflect the capacity for consciousness) associated with the network model in wakefulness and sleep. As predicted, higher information integration is found in wakefulness than in sleep for all five cases.
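A stripped-down sketch of the model class above: a two-channel MVAR(1) driven by an exogenous stimulation input, with the autoregressive matrix and input gains recovered by least squares. The filter bank is reduced to a single per-channel gain, and all numbers are simulated assumptions, not the patient recordings.

```python
import numpy as np

rng = np.random.default_rng(2)
T = 5000
A = np.array([[0.5, 0.2],     # autoregressive interaction matrix
              [0.1, 0.6]])
b = np.array([0.8, -0.4])     # per-channel gain on the exogenous input
u = (np.arange(T) % 200 == 0).astype(float)   # sparse "stimulation" pulses

# Simulate a 2-channel MVAR(1) with exogenous input:
#   y[t] = A @ y[t-1] + b * u[t] + noise
y = np.zeros((T, 2))
for t in range(1, T):
    y[t] = A @ y[t - 1] + b * u[t] + 0.1 * rng.standard_normal(2)

# Estimate [A | b] jointly by least squares, one regression per channel.
X = np.column_stack([y[:-1], u[1:]])          # regressors: lagged y and input
coef, *_ = np.linalg.lstsq(X, y[1:], rcond=None)
A_hat, b_hat = coef[:2].T, coef[2]
```

In the full model each channel's exogenous gain would be replaced by a filter (a lagged set of input coefficients), estimated in the same joint regression.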
Interpreting support vector machine models for multivariate group wise analysis in neuroimaging.
Gaonkar, Bilwaj; T Shinohara, Russell; Davatzikos, Christos
2015-08-01
Machine learning based classification algorithms like support vector machines (SVMs) have shown great promise for turning high-dimensional neuroimaging data into clinically useful decision criteria. However, tracing imaging-based patterns that contribute significantly to classifier decisions remains an open problem. This is an issue of critical importance in imaging studies seeking to determine which anatomical or physiological imaging features contribute to the classifier's decision, thereby allowing users to critically evaluate the findings of such machine learning methods and to understand disease mechanisms. The majority of published work addresses the question of statistical inference for support vector classification using permutation tests based on SVM weight vectors. Such permutation testing ignores the SVM margin, which is critical in SVM theory. In this work we emphasize the use of a statistic that explicitly accounts for the SVM margin and show that the null distributions associated with this statistic are asymptotically normal. Further, our experiments show that this statistic is far less conservative than weight-based permutation tests and yet specific enough to tease out multivariate patterns in the data. Thus, we can better understand the multivariate patterns that the SVM uses for neuroimaging-based classification.
Directory of Open Access Journals (Sweden)
Luis Cláudio Lemos Correia
Abstract Background: Currently, there is no validated multivariate model to predict the probability of obstructive coronary disease in patients with acute chest pain. Objective: To develop and validate a multivariate model to predict coronary artery disease (CAD) based on variables assessed at admission to the coronary care unit (CCU) due to acute chest pain. Methods: A total of 470 patients were studied, 370 as the derivation sample and the subsequent 100 patients as the validation sample. As the reference standard, angiography was required to rule in CAD (stenosis ≥ 70%), while either angiography or a negative noninvasive test could be used to rule it out. As predictors, 13 baseline variables related to medical history, 14 characteristics of chest discomfort, and eight variables from physical examination or laboratory tests were tested. Results: The prevalence of CAD was 48%. By logistic regression, six variables remained independent predictors of CAD: age, male gender, relief with nitrate, signs of heart failure, positive electrocardiogram, and troponin. The area under the curve (AUC) of this final model was 0.80 (95% confidence interval [95% CI] = 0.75-0.84) in the derivation sample and 0.86 (95% CI = 0.79-0.93) in the validation sample. The Hosmer-Lemeshow test indicated good calibration in both samples (p = 0.98 and p = 0.23, respectively). Compared with a basic model containing electrocardiogram and troponin, the full model provided an AUC increment of 0.07 in both the derivation (p = 0.0002) and validation (p = 0.039) samples. Integrated discrimination improvement was 0.09 in both the derivation (p < 0.001) and validation (p < 0.0015) samples. Conclusion: A multivariate model was derived and validated as an accurate tool for estimating the pretest probability of CAD in patients with acute chest pain.
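A sketch of the modeling recipe above (logistic regression on admission variables, discrimination summarized by AUC) on synthetic data; the predictor distributions and "true" coefficients are hypothetical stand-ins for the six retained variables, so the resulting AUC is not the paper's.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(3)
n = 470
# Hypothetical stand-ins for the six retained predictors: age, male sex,
# relief with nitrate, heart-failure signs, positive ECG, troponin.
X = np.column_stack([
    rng.normal(60.0, 12.0, n),      # age (years)
    rng.integers(0, 2, n),          # male sex
    rng.integers(0, 2, n),          # relief with nitrate
    rng.integers(0, 2, n),          # signs of heart failure
    rng.integers(0, 2, n),          # positive electrocardiogram
    rng.lognormal(0.0, 1.0, n),     # troponin (skewed marker)
])
# Assumed generating coefficients, chosen only to yield a plausible outcome mix.
logit = (-6.0 + 0.06 * X[:, 0] + 0.7 * X[:, 1] - 0.5 * X[:, 2]
         + 0.9 * X[:, 3] + 1.2 * X[:, 4] + 0.5 * np.log1p(X[:, 5]))
y = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

model = LogisticRegression(max_iter=1000).fit(X, y)
auc = roc_auc_score(y, model.predict_proba(X)[:, 1])  # apparent discrimination
```

A real validation, as in the study, would compute the AUC on a held-out sample and check calibration (e.g., Hosmer-Lemeshow) rather than reuse the derivation data.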
M.G.E. Verdam; F.J. Oort
2014-01-01
Highlights: - Application of Kronecker product to construct parsimonious structural equation models for multivariate longitudinal data. - A method for the investigation of measurement bias with Kronecker product restricted models. - Application of these methods to health-related quality of life data
Directory of Open Access Journals (Sweden)
Gholamreza Oskrochi
2016-03-01
In this study, four major muscles acting on the scapula were investigated in patients who had been treated in the last six years for unilateral carcinoma of the breast. Muscle activity was assessed by electromyography during abduction and adduction of the affected and unaffected arms. The principal aim of the study was to compare shoulder muscle activity in the affected and unaffected shoulder during elevation of the arm. A multivariate linear mixed model was introduced and applied to address this aim. Fitting this model to the data shows a substantial improvement compared to the alternatives.
Khoshravesh, Mojtaba; Sefidkouhi, Mohammad Ali Gholami; Valipour, Mohammad
2017-07-01
The proper evaluation of evapotranspiration is essential in food security investigation, farm management, pollution detection, irrigation scheduling, nutrient flows, carbon balance as well as hydrologic modeling, especially in arid environments. To achieve sustainable development and to ensure water supply, especially in arid environments, irrigation experts need tools to estimate reference evapotranspiration on a large scale. In this study, the monthly reference evapotranspiration was estimated by three different regression models including the multivariate fractional polynomial (MFP), robust regression, and Bayesian regression in Ardestan, Esfahan, and Kashan. The results were compared with the Food and Agriculture Organization (FAO) Penman-Monteith method (FAO-PM) to select the best model. The results show that at a monthly scale, all models provided a close agreement with the values calculated by FAO-PM (R² > 0.95 and RMSE < 12.07 mm month⁻¹). However, the MFP model gives better estimates than the other two models for estimating reference evapotranspiration at all stations.
Bevacqua, Emanuele; Maraun, Douglas; Hobæk Haff, Ingrid; Widmann, Martin; Vrac, Mathieu
2017-06-01
Compound events (CEs) are multivariate extreme events in which the individual contributing variables may not be extreme themselves, but their joint - dependent - occurrence causes an extreme impact. Conventional univariate statistical analysis cannot give accurate information regarding the multivariate nature of these events. We develop a conceptual model, implemented via pair-copula constructions, which allows for the quantification of the risk associated with compound events in present-day and future climate, as well as the uncertainty estimates around such risk. The model includes predictors, which could represent for instance meteorological processes that provide insight into both the involved physical mechanisms and the temporal variability of compound events. Moreover, this model enables multivariate statistical downscaling of compound events. Downscaling is required to extend the compound events' risk assessment to the past or future climate, where climate models either do not simulate realistic values of the local variables driving the events or do not simulate them at all. Based on the developed model, we study compound floods, i.e. joint storm surge and high river runoff, in Ravenna (Italy). To explicitly quantify the risk, we define the impact of compound floods as a function of sea and river levels. We use meteorological predictors to extend the analysis to the past, and get a more robust risk analysis. We quantify the uncertainties of the risk analysis, observing that they are very large due to the shortness of the available data, though this may also be the case in other studies where they have not been estimated. Ignoring the dependence between sea and river levels would result in an underestimation of risk; in particular, the expected return period of the highest compound flood observed increases from about 20 to 32 years when switching from the dependent to the independent case.
Directory of Open Access Journals (Sweden)
E. Bevacqua
2017-06-01
Full Text Available Compound events (CEs) are multivariate extreme events in which the individual contributing variables may not be extreme themselves, but their joint, dependent occurrence causes an extreme impact. Conventional univariate statistical analysis cannot give accurate information regarding the multivariate nature of these events. We develop a conceptual model, implemented via pair-copula constructions, which allows for the quantification of the risk associated with compound events in present-day and future climate, as well as the uncertainty estimates around such risk. The model includes predictors, which could represent for instance meteorological processes that provide insight into both the involved physical mechanisms and the temporal variability of compound events. Moreover, this model enables multivariate statistical downscaling of compound events. Downscaling is required to extend the compound events' risk assessment to the past or future climate, where climate models either do not simulate realistic values of the local variables driving the events or do not simulate them at all. Based on the developed model, we study compound floods, i.e. joint storm surge and high river runoff, in Ravenna (Italy). To explicitly quantify the risk, we define the impact of compound floods as a function of sea and river levels. We use meteorological predictors to extend the analysis to the past, and get a more robust risk analysis. We quantify the uncertainties of the risk analysis, observing that they are very large due to the shortness of the available data, though this may also be the case in other studies where they have not been estimated. Ignoring the dependence between sea and river levels would result in an underestimation of risk; in particular, the expected return period of the highest compound flood observed increases from about 20 to 32 years when switching from the dependent to the independent case.
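The risk-underestimation point can be reproduced in miniature. The sketch below uses a bivariate normal as a crude stand-in for the fitted pair-copula model (the correlation rho and the 98th-percentile threshold are invented assumptions, not the Ravenna fit) and compares the compound-event return period with and without the sea-river dependence:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 500_000  # synthetic "years" of joint driver data

# Bivariate normal stand-in for the fitted pair-copula model of sea level
# and river runoff; rho is an invented illustrative value.
rho = 0.6
sea, river = rng.multivariate_normal(
    [0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=n).T

# Compound event: both drivers exceed their own 98th percentile.
t_sea, t_river = np.quantile(sea, 0.98), np.quantile(river, 0.98)
p_dep = np.mean((sea > t_sea) & (river > t_river))       # dependence kept
p_ind = np.mean(sea > t_sea) * np.mean(river > t_river)  # independence assumed

rp_dep, rp_ind = 1.0 / p_dep, 1.0 / p_ind  # return periods in years
print(f"dependent: {rp_dep:.0f} y, independent: {rp_ind:.0f} y")
```

Assuming independence inflates the return period of the joint exceedance, i.e. it makes the compound flood look rarer than it is, which is exactly the direction of bias reported in the abstract.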
Directory of Open Access Journals (Sweden)
Yang Yu
2013-01-01
Full Text Available Based on a brief review of the current-harmonics generation mechanism for grid-connected inverters under distorted grid voltage, the harmonic disturbances and uncertain items are embedded in the original state-space differential equation of the grid-connected inverter. A new algorithm for global current harmonic rejection, based on nonlinear backstepping control with the multivariable internal model principle, is proposed for grid-connected inverters with exogenous disturbances and uncertainties. A type of multivariable internal model for a class of nonlinear harmonic disturbances is constructed. Based on the backstepping control law of the nominal system, a multivariable adaptive state feedback controller combined with the multivariable internal model and an adaptive control law is designed to guarantee that the closed-loop system is globally uniformly bounded, which is proved by a constructed Lyapunov function. The presented algorithm extends harmonic rejection from nonlinear single-input systems to the multivariable globally defined normal form; its correctness and effectiveness are verified by simulation results.
Majdan, Marek; Brazinova, Alexandra; Rusnak, Martin; Leitgeb, Johannes
2017-01-01
Objectives: Prognosis of outcome after traumatic brain injury (TBI) is important in the assessment of quality of care and can help improve treatment and outcome. The aim of this study was to compare the prognostic value of relatively simple injury severity scores between each other and against a gold standard model, the IMPACT-extended (IMP-E) multivariable prognostic model. Materials and Methods: For this study, 866 patients with moderate/severe TBI from Austria were analyzed. The prognostic performances of the Glasgow coma scale (GCS), GCS motor (GCSM) score, abbreviated injury scale for the head region, Marshall computed tomographic (CT) classification, and Rotterdam CT score were compared side-by-side and against the IMP-E score. The area under the receiver operating characteristics curve (AUC) and Nagelkerke's R2 were used to assess prognostic performance. Outcomes at the intensive care unit, at hospital discharge, and at 6 months (mortality and unfavorable outcome) were used as end-points. Results: Comparing the AUCs and R2s of the same model across the four outcomes, only little variation was apparent, and a similar pattern was observed when comparing the models between each other. All simple scores performed worse than the IMP-E model (AUC > 0.83 and R2 > 0.42 for all outcomes): their AUCs were lower by 0.10-0.22. Conclusions: All analyzed scores provide reasonably good prognosis. However, it is confirmed that well-developed multivariable prognostic models outperform these scores significantly and should be used for prognosis in patients after TBI wherever possible.
Lionello, Marco; Staffieri, Claudia; Breda, Stefano; Turato, Chiara; Giacomelli, Luciano; Magnavita, Paola; de Filippis, Cosimo; Staffieri, Alberto; Marioni, Gino
2015-08-01
With a worldwide incidence estimated at 8-15 per 100,000 population a year, idiopathic sudden sensorineural hearing loss (ISSHL) is a common clinical finding for otologists. There is a shortage of information on the clinical factors capable of predicting hearing recovery and response to therapy. The aim of the present study was to retrospectively investigate the prognostic value of clinical variables in relation to hearing recovery, in a cohort of 117 consecutive patients with ISSHL. Clinical parameters (signs, symptoms, comorbidities and treatments) and audiometric data were analyzed with univariate and multivariate statistical approaches for prognostic purposes to identify any correlation with hearing recovery, also expressed according to the Wilson criteria. Univariate analysis showed that age and hypertension were significantly related to hearing outcome (p = 0.004 and p = 0.015, respectively). Elderly patients and those with hypertension were at higher risk of experiencing no hearing recovery (OR = 3.25 and OR = 2.89, respectively). Age was an independent prognostic factor on multivariate analysis (p = 0.007). Tinnitus as a presenting symptom showed a trend towards an association with hearing recovery (p = 0.07). The treatment regimen, the time elapsing between the onset of symptoms and the start of therapy (p = 0.34), and the duration of the treatment (p = 0.83) were unrelated to recovery on univariate analysis. Among the parameters considered, only age was significantly and independently related to hearing outcome. There is a need for well-designed, randomized clinical trials to enable an evidence-based protocol to be developed for the treatment of ISSHL.
Lourenço, Vera; Herdling, Thorsten; Reich, Gabriele; Menezes, José C; Lochmann, Dirk
2011-08-01
A set of 192 fluid bed granulation batches at industrial scale were in-line monitored using microwave resonance technology (MRT) to determine moisture, temperature and density of the granules. Multivariate data analysis techniques such as multiway partial least squares (PLS), multiway principal component analysis (PCA) and multivariate batch control charts were applied onto collected batch data sets. The combination of all these techniques, along with off-line particle size measurements, led to significantly increased process understanding. A seasonality effect could be put into evidence that impacted further processing through its influence on the final granule size. Moreover, it was demonstrated by means of a PLS that a relation between the particle size and the MRT measurements can be quantitatively defined, highlighting a potential ability of the MRT sensor to predict information about the final granule size. This study has contributed to improve a fluid bed granulation process, and the process knowledge obtained shows that the product quality can be built in process design, following Quality by Design (QbD) and Process Analytical Technology (PAT) principles.
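A minimal sketch of the batch-monitoring idea described above, assuming synthetic batch summaries rather than the paper's MRT data: fit a PCA model on reference batches and flag deviating batches via a Hotelling T^2 control limit (the variable names, mixing matrix, and limit quantile are all illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical batch-wise summaries: rows = 50 reference ("good") batches,
# columns = three process variables (e.g. moisture, temperature, density).
mix = np.array([[1.0, 0.4, 0.2],
                [0.0, 1.0, 0.3],
                [0.0, 0.0, 1.0]])
X_ref = rng.normal(size=(50, 3)) @ mix

# Centre and scale with the reference batches, then fit a 2-component PCA.
mu, sd = X_ref.mean(axis=0), X_ref.std(axis=0)
Xs = (X_ref - mu) / sd
_, s, vt = np.linalg.svd(Xs, full_matrices=False)
k = 2
lam = (s[:k] ** 2) / (len(Xs) - 1)  # variances of the retained scores

def hotelling_t2(x_new):
    """Hotelling T^2 of a batch in the reference PCA model."""
    t = ((x_new - mu) / sd) @ vt[:k].T
    return float(np.sum(t ** 2 / lam))

# Empirical 99% control limit from the reference batches.
t2_ref = np.array([hotelling_t2(x) for x in X_ref])
limit = np.quantile(t2_ref, 0.99)

odd_batch = X_ref[0] + np.array([6.0, -5.0, 4.0])  # grossly deviating batch
print(f"limit: {limit:.2f}, deviating batch T^2: {hotelling_t2(odd_batch):.2f}")
```

A multivariate control chart of this kind flags batches whose joint behaviour is abnormal even when each variable alone stays within its univariate limits.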
Multivariate bubbles and antibubbles
Fry, John
2014-08-01
In this paper we develop models for multivariate financial bubbles and antibubbles based on statistical physics. In particular, we extend a rich set of univariate models to higher dimensions. Changes in market regime can be explicitly shown to represent a phase transition from random to deterministic behaviour in prices. Moreover, our multivariate models are able to capture some of the contagious effects that occur during such episodes. We are able to show that declining lending quality helped fuel a bubble in the US stock market prior to 2008. Further, our approach offers interesting insights into the spatial development of UK house prices.
Horton, Rebecca B; McConico, Morgan; Landry, Currie; Tran, Tho; Vogt, Frank
2012-10-09
Innovations in chemometrics are required for studies of chemical systems that are governed by nonlinear responses to chemical parameters and/or interdependencies (coupling) among these parameters. Conventional and linear multivariate models have limited use for quantitative and qualitative investigations of such systems because they are based on the assumption that the measured data are simple superpositions of several input parameters. 'Predictor Surfaces' were developed for studies of more chemically complex systems such as biological materials in order to ensure accurate quantitative analyses and proper chemical modeling for in-depth studies of such systems. Predictor Surfaces are based on approximating nonlinear multivariate model functions by multivariate Taylor expansions which inherently introduce the required coupled and higher-order predictor variables. As proof-of-principle for the Predictor Surfaces' capabilities, an application from environmental analytical chemistry was chosen. Microalgae cells are known to sensitively adapt to changes in environmental parameters such as pollution and/or nutrient availability and thus have potential as novel in situ sensors for environmental monitoring. These adaptations of the microalgae cells are reflected in their chemical signatures which were then acquired by means of FT-IR spectroscopy. In this study, the concentrations of three nutrients, namely inorganic carbon and two nitrogen-containing ions, were chosen. Biological considerations predict that changes in nutrient availability produce a nonlinear response in the cells' biomass composition; it is also known that microalgae need certain nutrient mixes to thrive. The nonlinear Predictor Surfaces were demonstrated to be more accurate in predicting the values of these nutrients' concentrations than principal component regression. For qualitative chemical studies of biological systems, the Predictor Surfaces themselves are a novel tool as they visualize
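The core idea, approximating a nonlinear multivariate response by a truncated Taylor expansion whose cross terms capture parameter coupling, can be sketched as follows (synthetic data; the quadratic feature set is an illustrative assumption, not the authors' exact construction):

```python
import numpy as np

rng = np.random.default_rng(1)

# Two hypothetical nutrient concentrations and a synthetic spectral response
# with a coupled (interaction) term: the kind of nonlinearity a truncated
# multivariate Taylor expansion is meant to capture.
c1 = rng.uniform(0.0, 1.0, 300)
c2 = rng.uniform(0.0, 1.0, 300)
y = (1.0 + 2.0 * c1 - 1.5 * c2 + 3.0 * c1 * c2 + 0.8 * c2**2
     + rng.normal(0.0, 0.05, 300))

ones = np.ones_like(c1)
X_lin = np.column_stack([ones, c1, c2])            # plain linear model
X_tay = np.column_stack([ones, c1, c2,             # 2nd-order Taylor terms:
                         c1 * c2, c1**2, c2**2])   # coupling + curvature

def fit_rss(X, y):
    """Least-squares fit; return the residual sum of squares."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    r = y - X @ beta
    return float(r @ r)

print(fit_rss(X_lin, y), fit_rss(X_tay, y))
```

The second-order model absorbs the interaction and curvature that a superposition-only model cannot, which mirrors the accuracy advantage reported over principal component regression.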
Naguib, Mohamed; Scamman, Franklin L; O'Sullivan, Cormac; Aker, John; Ross, Alan F; Kosmach, Steven; Ensor, Joe E
2006-03-01
We performed a case-controlled, double-blind study to examine the performance of three multivariate clinical models (the Wilson, Arné, and Naguib models) in the prediction of unanticipated difficult intubation. The study group consisted of 97 patients in whom an unanticipated difficult intubation had occurred. For each difficult intubation patient, a matched control patient was selected in whom tracheal intubation had been easily accomplished. Postoperatively, a blinded investigator evaluated both patients. The clinical assessment included the patient's weight, height, age, Mallampati score, interincisor gap, thyromental distance, thyrosternal distance, neck circumference, Wilson risk sum score, history of previous difficult intubation, and diseases associated with difficult laryngoscopy or intubation. The Naguib model was significantly more sensitive (81.4%) than the other models. A revised model was derived that includes thyromental distance, Mallampati score, interincisor gap, and height; it is 82.5% sensitive and 85.6% specific, with an area under the receiver operating characteristic curve of 0.90.
Modelling in Medical Technology Assessment
B.C. Michel (Bowine)
1996-01-01
Health care is a rapidly developing field in which new technologies are introduced continuously. Not all new technologies have the same impact, however: most represent only small changes to existing technologies, whereas only a few, like organ transplants, really are revolutionary new d
DEFF Research Database (Denmark)
Labouriau, Rodrigo
The theory of exponential dispersion models (EDM), to which Bent Jørgensen made substantial contributions, provides a flexible framework of models alternative to the classic Gaussian linear models (e.g. generalized linear models and additive models). I review some multivariate extensions of thos...... to EDMs in order to properly represent and interpret the biological questions of interest. Bent Jørgensen advocated similar ideas in his work since the 1980s.
Directory of Open Access Journals (Sweden)
Zouheir Mighri
2014-12-01
Full Text Available The aim of this article is to examine how the dynamics of correlations between two emerging countries (Brazil and Mexico) and the US evolved from January 2003 to December 2013. The main contribution of this study is to explore whether the plunging stock market in the US, in the aftermath of the global financial crisis (2007–2009), exerted contagion effects on emerging stock markets. To this end, we rely on a multivariate fractionally integrated asymmetric power autoregressive conditional heteroskedasticity dynamic conditional correlation framework, which accounts for long memory, power effects, leverage terms, and time-varying correlations. The empirical analysis shows a contagion effect for Brazil and Mexico during the early stages of the global financial crisis, indicating signs of "recoupling." Nevertheless, linkages show a general pattern of "decoupling" after the Lehman Brothers collapse. Furthermore, correlations between Brazil and the US decreased from early 2009 onwards, implying that their dependence is larger in bearish than in bullish markets.
Seismic Physical Modeling Technology and Its Applications
Institute of Scientific and Technical Information of China (English)
Anonymous
2006-01-01
This paper introduces the seismic physical modeling technology in the CNPC Key Lab of Geophysical Exploration. It includes the seismic physical model positioning system, the data acquisition system, sources, transducers, model materials, model building techniques, precision measurements of model geometry, the basic principles of seismic physical modeling and experimental methods, and two physical model examples.
Directory of Open Access Journals (Sweden)
Dimitrios Kourkoutas
2009-04-01
Full Text Available Dimitrios Kourkoutas(1,2), Gerasimos Georgopoulos(1), Antonios Maragos(1), et al. (1) Department of Ophthalmology, Medical School, Athens University, Athens, Greece; (2) Department of Ophthalmology, 417 Hellenic Army Shared Fund Hospital, Athens, Greece. Purpose: In this paper a new nonlinear multivariable regression method is presented in order to investigate the relationship between central corneal thickness (CCT) and Heidelberg Retina Tomograph II (HRT II) optic nerve head (ONH) topographic measurements in patients with established glaucoma. Methods: Forty-nine eyes of 49 patients with glaucoma were included in this study. Inclusion criteria were (a) HRT II ONH imaging of good quality (SD < 30 μm), (b) reliable Humphrey visual field tests (30-2 program), and (c) bilateral CCT measurements with ultrasonic contact pachymetry. Patients were classified as glaucomatous based on visual field and/or ONH damage. The relationship between CCT and topographic parameters was analyzed by using the new nonlinear multivariable regression model. Results: In the entire group, CCT was 549.78 ± 33.08 μm (range: 484–636 μm); intraocular pressure (IOP) was 16.4 ± 2.67 mmHg (range: 11–23 mmHg); MD was −3.80 ± 4.97 dB (range: 4.04 to −20.4 dB); refraction was −0.78 ± 2.46 D (range: −6.0 to +3.0 D). The new nonlinear multivariable regression model indicated that CCT was significantly related (R2 = 0.227, p < 0.01) to rim volume nasally and type of diagnosis. Conclusions: Using the new nonlinear multivariable regression model, our data showed a statistically significant correlation between CCT and HRT II ONH structural measurements in patients with established glaucoma. Keywords: central corneal thickness, glaucoma, optic nerve head, HRT
Cheng, Wen; Gill, Gurdiljot Singh; Dasu, Ravi; Xie, Meiquan; Jia, Xudong; Zhou, Jiao
2017-02-01
Most studies focus on general crashes or total crash counts, with considerably less research dedicated to individual crash types. This study employs the systemic approach for detection of hotspots and comprehensively cross-validates five multivariate crash-type-based hotspot identification (HSID) methods which incorporate spatial and temporal random effects. It is anticipated that comparing the crash estimation results of the five models will identify the impact of the various random effects on HSID. Data over a ten-year period (2003-2012) were selected for analysis of a total of 137 intersections in the City of Corona, California. The crash types collected in this study include rear-end, head-on, side-swipe, broad-side, hit-object, and others. Statistically significant correlations among crash outcomes for the heterogeneity error term were observed, which clearly demonstrates their multivariate nature. Additionally, the spatial random effects revealed correlations among neighboring intersections across crash types. Five cross-validation criteria, namely Residual Sum of Squares, Kappa, Mean Absolute Deviation, Method Consistency Test, and Total Rank Difference, were applied to assess the performance of the five HSID methods at crash estimation. In terms of accumulated results combining all crash types, the model with spatial random effects consistently outperformed the other competing models by a significant margin. However, the inclusion of spatial random effects in temporal models fell short of attaining the expected results. The overall observation from the model fitness and validation results failed to highlight any correlation between better model fit and superior crash estimation.
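Three of the cross-validation criteria named above are easy to state concretely. A sketch with invented crash counts (not the Corona data; the two "models" are hypothetical):

```python
import numpy as np

# Hypothetical observed rear-end crash counts at 8 intersections and the
# estimates from two candidate models (all numbers invented).
obs     = np.array([12,  3,  7, 25,  9,  1, 14,  6])
model_a = np.array([10,  4,  8, 22, 11,  2, 13,  7])
model_b = np.array([ 5,  9,  6, 15, 12,  4, 10,  8])

def rss(o, p):
    """Residual Sum of Squares between observed and predicted counts."""
    return float(np.sum((o - p) ** 2))

def mad(o, p):
    """Mean Absolute Deviation between observed and predicted counts."""
    return float(np.mean(np.abs(o - p)))

def total_rank_difference(o, p):
    """Sum of |rank difference| between observed and predicted site
    rankings (rank 0 = most crashes); compares hotspot orderings."""
    rank = lambda x: np.argsort(np.argsort(-x))
    return int(np.sum(np.abs(rank(o) - rank(p))))

for name, pred in [("A", model_a), ("B", model_b)]:
    print(name, rss(obs, pred), mad(obs, pred),
          total_rank_difference(obs, pred))
```

A model can score well on count accuracy (RSS, MAD) yet reorder the hotspot ranking; that is why rank-based criteria are evaluated alongside residual-based ones.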
Robitaille, Annie; Muniz, Graciela; Piccinin, Andrea M; Johansson, Boo; Hofer, Scott M
2012-01-01
We illustrate the use of the parallel latent growth curve model using data from OCTO-Twin. We found a significant intercept-intercept and slope-slope association between processing speed and visuospatial ability. Within-person correlations among the occasion-specific residuals were significant, suggesting that the occasion-specific fluctuations around individuals' trajectories, after controlling for intraindividual change, are related between the two outcomes. Random and fixed effects for visuospatial ability are reduced when we include structural parameters (directional growth curve model), providing information about changes in visuospatial ability after controlling for processing speed. We recommend this model to researchers interested in the analysis of multivariate longitudinal change, as it permits decomposition and directly interpretable estimates of association among initial levels, rates of change, and occasion-specific variation.
Energy Technology Data Exchange (ETDEWEB)
Batlle, C.; Barquin, J. [Universidad Pontifica Comillas, Madrid (Spain). Instituto de Investigacion Tecnologica
2004-05-01
This paper presents a fuel prices scenario generator in the frame of a simulation tool developed to support risk analysis in a competitive electricity environment. The tool feeds different exogenous risk factors to a wholesale electricity market model to perform a statistical analysis of the results. As the different fuel price series studied, such as those for oil and gas, present stochastic volatility and strong correlation among them, a multivariate Generalized Autoregressive Conditional Heteroskedastic (GARCH) model has been designed in order to allow the generation of future fuel price paths. The model makes use of a decomposition method to simplify the consideration of the multidimensional conditional covariance. An example of its application with real data is also presented. (author)
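The paper's multivariate GARCH with its covariance decomposition is not reproduced here; the sketch below shows only the univariate GARCH(1,1) recursion that such scenario generators build on, with invented parameter values:

```python
import numpy as np

rng = np.random.default_rng(3)

def simulate_garch11(n, omega=0.05, alpha=0.08, beta=0.90):
    """Simulate n returns from a GARCH(1,1) process:
        h_t = omega + alpha * r_{t-1}^2 + beta * h_{t-1},
        r_t = sqrt(h_t) * z_t,  z_t ~ N(0, 1).
    Covariance-stationary when alpha + beta < 1."""
    h = np.empty(n)
    r = np.empty(n)
    h[0] = omega / (1.0 - alpha - beta)  # start at unconditional variance
    r[0] = np.sqrt(h[0]) * rng.standard_normal()
    for t in range(1, n):
        h[t] = omega + alpha * r[t - 1] ** 2 + beta * h[t - 1]
        r[t] = np.sqrt(h[t]) * rng.standard_normal()
    return r, h

returns, variances = simulate_garch11(5000)
```

Repeating such simulations for each fuel series and coupling them through a conditional correlation structure is, in spirit, how a multivariate scenario generator produces future price paths.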
An age structured demographic model of technology
Mercure, J.-F.
2013-01-01
At the heart of technology transitions lie complex processes of technology choices. Understanding and planning sustainability transitions requires modelling work, which necessitates a theory of technology substitution. A theoretical model of technological change and turnover is presented, intended as a methodological paradigm shift from widely used conventional modelling approaches such as cost optimisation. It follows the tradition of evolutionary economics and evolutionary game theory, using ecological population growth dynamics to represent the evolution of technology populations in the marketplace, with substitutions taking place at the level of the decision-maker. Extended to use principles of human demography or the age structured evolution of species in interacting ecosystems, this theory is built from first principles, and through an appropriate approximation, reduces to a form identical to empirical models of technology diffusion common in the technology transitions literature. Using an age structure...
2016-09-23
Performing organizations: Oak Ridge Institute for Science and Education, Wright-Patterson AFB, OH; Air Force Research Laboratory, Wright-Patterson AFB, OH; Ball Aerospace and Technologies Corporation, Dayton, OH. Sponsoring agency: 711 HPW/RHCP, Warfighter Interface Division, Applied Neuroscience Branch, Wright-Patterson Air Force Base, OH 45433.
Collins, G. S.; Reitsma, J. B.; Altman, D. G.; Moons, K. G M|info:eu-repo/dai/nl/152483519
2015-01-01
Background: Prediction models are developed to aid healthcare providers in estimating the probability or risk that a specific disease or condition is present (diagnostic models) or that a specific event will occur in the future (prognostic models), to inform their decision-making. However, the overw
Collins, G. S.; Reitsma, J. B.; Altman, D. G.; Moons, K. G M|info:eu-repo/dai/nl/152483519
2015-01-01
Prediction models are developed to aid healthcare providers in estimating the probability or risk that a specific disease or condition is present (diagnostic models) or that a specific event will occur in the future (prognostic models), to inform their decision-making. However, the overwhelming evid
DEFF Research Database (Denmark)
Hansen, Peter Reinhard; Lunde, Asger; Voev, Valeri
the model to market returns in conjunction with an individual asset yields a model for the conditional regression coefficient, known as the beta. We apply the model to a set of highly liquid stocks and find that conditional betas are much more variable than usually observed with rolling-window OLS...
The affine constrained GNSS attitude model and its multivariate integer least-squares solution
Teunissen, P.J.G.
2012-01-01
A new global navigation satellite system (GNSS) carrier-phase attitude model and its solution are introduced in this contribution. This affine-constrained GNSS attitude model has the advantage that it avoids the computational complexity of the orthonormality-constrained GNSS attitude model, while it
Collins, Gary S; Reitsma, Johannes B; Altman, Douglas G; Moons, Karel G M
2015-01-01
Prediction models are developed to aid health care providers in estimating the probability or risk that a specific disease or condition is present (diagnostic models) or that a specific event will occur in the future (prognostic models), to inform their decision making. However, the overwhelming evi
Directory of Open Access Journals (Sweden)
Eric van Diessen
Full Text Available BACKGROUND: Electroencephalogram (EEG) acquisition is routinely performed to support an epileptic origin of paroxysmal events in patients referred with a possible diagnosis of epilepsy. However, in children with partial epilepsies the interictal EEGs are often normal. We aimed to develop a multivariable diagnostic prediction model based on electroencephalogram functional network characteristics. METHODOLOGY/PRINCIPAL FINDINGS: Routinely performed interictal EEG recordings at first presentation of 35 children diagnosed with partial epilepsies, and of 35 children in whom the diagnosis of epilepsy was excluded (control group), were used to develop the prediction model. Children with partial epilepsy were individually matched on age and gender with children from the control group. Periods of resting-state EEG, free of abnormal slowing or epileptiform activity, were selected to construct functional networks of correlated activity. We calculated multiple network characteristics previously used in functional network epilepsy studies and used these measures to build a robust, decision-tree-based prediction model. Based on epileptiform EEG activity only, EEG results supported the diagnosis of epilepsy with a sensitivity and specificity of 0.77 and 0.91, respectively. In contrast, the prediction model had a sensitivity of 0.96 [95% confidence interval: 0.78-1.00] and specificity of 0.95 [95% confidence interval: 0.76-1.00] in correctly differentiating patients from controls. The overall discriminative power, quantified as the area under the receiver operating characteristic curve, was 0.89, defined as an excellent model performance. The need for a multivariable network analysis to improve diagnostic accuracy was emphasized by the lack of discriminatory power of single network characteristics or the EEG's power spectral density. CONCLUSIONS/SIGNIFICANCE: Diagnostic accuracy in children with partial epilepsy is substantially improved with a model combining functional
McBee, Megan E.; Zeng, Yu; Parry, Nicola; Nagler, Cathryn R.; Tannenbaum, Steven R.
2010-01-01
Background: Diagnosis of chronic intestinal inflammation, which characterizes inflammatory bowel disease (IBD), along with prediction of disease state, is hindered by the limited availability of predictive serum biomarkers. Serum biomarkers predictive of disease state would improve trials for therapeutic intervention and disease monitoring, particularly in genetically susceptible individuals. Chronic inflammation during IBD is considered distinct from infectious intestinal inflammation, thereby requiring biomarkers to provide differential diagnosis. To address whether differential serum biomarkers could be identified in murine models of colitis, immunological profiles from both chronic spontaneous and acute infectious colitis were compared and predictive serum biomarkers identified via multivariate modeling. Methodology/Principal Findings: Discriminatory multivariate modeling of 23 cytokines plus chlorotyrosine and nitrotyrosine (protein adducts from reactive nitrogen species and hypochlorite) in serum and tissue from two murine models of colitis was performed to identify disease-associated biomarkers. Acute C. rodentium-induced colitis in C57BL/6J mice and chronic spontaneous Helicobacter-dependent colitis in TLR4−/− x IL-10−/− mice were utilized for evaluation. Colon profiles of both colitis models were nearly identical, with chemokines and neutrophil- and Th17-related factors highly associated with intestinal disease. In acute colitis, the discriminatory disease-associated serum factors were not those identified in the colon. In contrast, the discriminatory predictive serum factors for chronic colitis were neutrophil- and Th17-related factors (KC, IL-12/23p40, IL-17, G-CSF, and chlorotyrosine) that were also elevated in colon tissue. Chronic colitis serum biomarkers were specific to chronic colitis, as they were not discriminatory for acute colitis. Conclusions/Significance: Immunological profiling revealed strikingly similar colon profiles, yet distinctly different serum
Chiu, Chi-Yang; Jung, Jeesun; Chen, Wei; Weeks, Daniel E; Ren, Haobo; Boehnke, Michael; Amos, Christopher I; Liu, Aiyi; Mills, James L; Ting Lee, Mei-Ling; Xiong, Momiao; Fan, Ruzong
2017-02-01
To analyze next-generation sequencing data, multivariate functional linear models are developed for a meta-analysis of multiple studies to connect genetic variant data to multiple quantitative traits adjusting for covariates. The goal is to take advantage of both meta-analysis and pleiotropic analysis in order to improve power and to carry out a unified association analysis of multiple studies and multiple traits of complex disorders. Three types of approximate F-distributions based on the Pillai-Bartlett trace, Hotelling-Lawley trace, and Wilks's Lambda are introduced to test for association between multiple quantitative traits and multiple genetic variants. Simulation analysis is performed to evaluate false-positive rates and power of the proposed tests. The proposed methods are applied to analyze lipid traits in eight European cohorts. It is shown that it is more advantageous to perform multivariate analysis than univariate analysis in general, and it is more advantageous to perform a meta-analysis of multiple studies than to analyze the individual studies separately. The proposed models require individual observations. The value of the current paper can be seen for at least two reasons: (a) the proposed methods can be applied to studies that have individual genotype data; (b) the proposed methods can be used as a criterion for future work that uses summary statistics to build test statistics to meta-analyze the data.
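As a hedged illustration of one of the named statistics, the snippet below computes the Pillai-Bartlett trace, tr(H(H+E)^-1), for a one-way two-group MANOVA on synthetic two-trait data (the carrier vs non-carrier mean shift is invented, not a result from the cohorts):

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical data: two lipid traits for non-carriers (g0) and carriers
# (g1) of a variant; the carrier group mean is shifted by construction.
g0 = rng.normal([0.0, 0.0], 1.0, size=(60, 2))
g1 = rng.normal([0.8, 0.5], 1.0, size=(60, 2))

def pillai_trace(groups):
    """Pillai-Bartlett trace V = tr(H (H + E)^{-1}) for a one-way MANOVA,
    where H is the between-group and E the within-group SSCP matrix."""
    all_x = np.vstack(groups)
    grand = all_x.mean(axis=0)
    H = sum(len(g) * np.outer(g.mean(0) - grand, g.mean(0) - grand)
            for g in groups)
    E = sum((g - g.mean(0)).T @ (g - g.mean(0)) for g in groups)
    return float(np.trace(H @ np.linalg.inv(H + E)))

v = pillai_trace([g0, g1])
print(f"Pillai trace: {v:.3f}")  # 0 means no group effect
```

In the multivariate association setting, H plays the role of the genotype-explained variation across traits; approximate F-distributions for V are what the paper uses to obtain p-values.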
Institute of Scientific and Technical Information of China (English)
Yan Han; Shi Guoxu
2007-01-01
Urban water consumption has grey characteristics because it is influenced by the economy, population, standard of living, and so on. The multi-variable grey model (MGM(1,n)), as an expansion and complement of the GM(1,1) model, reveals the relationships of restriction and stimulation among variables, while the genetic algorithm has global-optimization and parallel-search characteristics. In this paper, the parameter q of the MGM(1,n) model was optimized, and a multi-variable grey model (MGM(1,n,q)) was built using a genetic algorithm. The model was validated against urban water consumption from 1990 to 2003 in Dalian City. The result indicated that the genetic-algorithm-based MGM(1,n,q) model outperformed the MGM(1,n) model, which in turn outperformed the GM(1,1) model.
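For orientation, the univariate GM(1,1) building block that MGM(1,n) extends can be sketched in a few lines (the consumption series is invented, not the Dalian data):

```python
import numpy as np

def gm11_fit_predict(x, n_ahead=1):
    """Classic GM(1,1) grey model: fit on series x, return the fitted
    values plus n_ahead forecasts. Assumes all x > 0."""
    x = np.asarray(x, dtype=float)
    x1 = np.cumsum(x)                 # accumulated generating operation
    z = 0.5 * (x1[:-1] + x1[1:])      # mean sequence of x1
    # Least squares for the grey equation x(k) + a*z(k) = b.
    B = np.column_stack([-z, np.ones(len(z))])
    a, b = np.linalg.lstsq(B, x[1:], rcond=None)[0]
    # Time response: x1_hat(k) = (x(0) - b/a) * exp(-a*k) + b/a.
    k = np.arange(len(x) + n_ahead)
    x1_hat = (x[0] - b / a) * np.exp(-a * k) + b / a
    # Restore the original series by inverse accumulation (differencing).
    return np.concatenate([[x[0]], np.diff(x1_hat)])

consumption = [102.0, 107.5, 113.8, 119.2, 126.0, 133.4]  # invented series
pred = gm11_fit_predict(consumption, n_ahead=2)
```

MGM(1,n) replaces the scalar grey equation with a coupled system over n series, so that each variable's accumulated sequence restricts or stimulates the others.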
Reynolds, C A; Baker, L A; Pedersen, N L
2000-11-01
Phenotypic assortment is assumed to be the principal mechanism of spouse similarity in most biometrical studies. Other assortment mechanisms, such as social homogamy, may be plausible. Two models are presented that consider phenotypic assortment and social homogamy simultaneously (i.e., mixed assortment), where selective associations between social background factors (Model I) versus selective associations between total environments (Model II) distinguish the models. A series of illustrative analyses was undertaken for education and fluid ability available on a sample of 116 Swedish twin pairs and their spouses. On the basis of several fit criteria Model I was preferred over Model II. Both social homogamy and phenotypic assortment may contribute to spouse similarity for educational attainment and fluid ability. Furthermore, spouse similarity for fluid ability may arise indirectly from social homogamy and phenotypic assortment for educational attainment. Power analyses indicated greater observed power for Model I than Model II. Additional power analyses indicated that considerably more twin-spouse sets would be needed for Model II than Model I, to resolve social homogamy and phenotypic assortment. Effects of misspecification of mechanisms of spouse similarity are also briefly discussed.
Technology and Online Education: Models for Change
Cook, Catherine W.; Sonnenberg, Christian
2014-01-01
This paper contends that technology changes advance online education. A number of mobile computing and transformative technologies will be examined and incorporated into a descriptive study. The object of the study will be to design innovative mobile awareness models seeking to understand technology changes for mobile devices and how they can be…
Carisi, Francesca; Domeneghetti, Alessio; Kreibich, Heidi; Schröter, Kai; Castellarin, Attilio
2017-04-01
Flood risk is a function of flood hazard and vulnerability; its accurate assessment therefore depends on a reliable quantification of both factors. The scientific literature proposes a number of objective and reliable methods for assessing flood hazard, yet it highlights a limited understanding of the fundamental damage processes. Loss modelling is associated with large uncertainty which is, among other factors, due to a lack of standard procedures; for instance, flood losses are often estimated based on damage models derived in completely different contexts (i.e. different countries or geographical regions) without checking their applicability, or by considering only one explanatory variable (typically water depth). We consider the Secchia river flood event of January 2014, when a sudden levee breach caused the inundation of nearly 200 km2 in Northern Italy. In the aftermath of this event, local authorities collected flood loss data, together with additional information on affected private households and industrial activities (e.g. building surface area and economic value, number of employees, and others). Based on these data we implemented and compared a quadratic-regression damage function, with water depth as the only explanatory variable, and a multi-variable model that combines multiple regression trees and considers several explanatory variables (i.e. bagging decision trees). Our results show the importance of data collection, revealing that (1) a simple quadratic-regression damage function based on empirical data from the study area can be significantly more accurate than literature damage models derived for a different context, and (2) multi-variable modelling may outperform the uni-variable approach, yet it is more difficult to develop and apply due to a much higher demand for detailed data.
Directory of Open Access Journals (Sweden)
Marek Majdan
2017-01-01
Full Text Available Objectives: Prognosis of outcome after traumatic brain injury (TBI) is important in the assessment of quality of care and can help improve treatment and outcome. The aim of this study was to compare the prognostic value of relatively simple injury severity scores between each other and against a gold standard model, the IMPACT-extended (IMP-E) multivariable prognostic model. Materials and Methods: For this study, 866 patients with moderate/severe TBI from Austria were analyzed. The prognostic performances of the Glasgow coma scale (GCS), GCS motor (GCSM) score, abbreviated injury scale for the head region, Marshall computed tomographic (CT) classification, and Rotterdam CT score were compared side-by-side and against the IMP-E score. The area under the receiver operating characteristics curve (AUC) and Nagelkerke's R2 were used to assess the prognostic performance. Outcomes at the intensive care unit, at hospital discharge, and at 6 months (mortality and unfavorable outcome) were used as end-points. Results: Comparing AUCs and R2s of the same model across the four outcomes, only little variation was apparent. A similar pattern was observed when comparing the models between each other. All simple scores performed worse than the IMP-E model (AUC > 0.83 and R2 > 0.42 for all outcomes): AUCs were worse by 0.10–0.22 (P < 0.05) and R2s were worse by 0.22–0.39 points. Conclusions: All tested simple scores can provide reasonably valid prognosis. However, it is confirmed that well-developed multivariable prognostic models outperform these scores significantly and should be used for prognosis in patients after TBI wherever possible.
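The AUC used to compare the scores above can be computed without an ROC sweep via the rank-sum (Mann-Whitney) identity: it equals the probability that a randomly chosen positive case receives a higher score than a randomly chosen negative case. A small sketch with invented scores and outcomes:

```python
import numpy as np

def auc_rank(scores, labels):
    """AUC via the rank-sum identity; labels are 0/1, higher score = more
    likely positive. Ties receive average ranks."""
    scores = np.asarray(scores, dtype=float)
    labels = np.asarray(labels, dtype=int)
    order = scores.argsort()
    ranks = np.empty(len(scores))
    ranks[order] = np.arange(1, len(scores) + 1)
    for s in np.unique(scores):            # average ranks over tied scores
        tie = scores == s
        ranks[tie] = ranks[tie].mean()
    n_pos = labels.sum()
    n_neg = len(labels) - n_pos
    return (ranks[labels == 1].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)

# Hypothetical severity scores (higher = worse) against unfavorable outcome.
scores = [3, 5, 7, 9, 4, 8, 6, 2]
labels = [0, 1, 1, 1, 0, 0, 0, 0]
auc = auc_rank(scores, labels)  # 0.8 for this toy data
```

An AUC of 0.5 corresponds to chance discrimination and 1.0 to perfect separation, which is the scale on which the simple scores are compared against the IMP-E model.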
Müller-Tasch, Thomas; Löwe, Bernd; Lossnitzer, Nicole; Frankenstein, Lutz; Täger, Tobias; Haass, Markus; Katus, Hugo; Schultz, Jobst-Hendrik; Herzog, Wolfgang
2017-07-01
While comprehensive evidence exists regarding negative effects of depression on self-care behaviours in patients with chronic heart failure (CHF), the relation between anxiety and self-care behaviours in patients with CHF is not clear. The aim of this study was to analyse the interactions between anxiety, depression and self-care behaviours in patients with CHF. The self-care behaviour of CHF outpatients was measured using the European Heart Failure Self-care Behaviour Scale (EHFScBS). The Patient Health Questionnaire (PHQ) was used to assess anxiety, and the PHQ-9 was used to measure depression severity. Differences between patients with and without anxiety were assessed with the respective tests. Associations between anxiety, self-care and other predictors were analysed using linear regressions. Of the 308 participating patients, 35 (11.4%) fulfilled the PHQ criteria for an anxiety disorder. These patients took antidepressants more frequently (11.8% versus 2.3%, p = .02), had had more contacts with their general practitioner within the last year (11.8 ± 16.1 versus 6.7 ± 8.6, p = .02), and had a higher PHQ-9 depression score (12.9 ± 5.7 versus 6.5 ± 4.7, p < .01) than patients without anxiety disorder. Anxiety and self-care were negatively associated (ß = -0.144, r(2) = 0.021, p = 0.015). The explanation of variance was augmented in a multivariate regression with the predictors age, sex, education, living with a partner, and New York Heart Association (NYHA) class (r(2) = 0.098) when anxiety was added (r(2) = 0.112). Depression further increased the explanation of variance (ß = -0.161, r(2) = 0.131, p = 0.019). Anxiety is negatively associated with self-care behaviour in patients with CHF. However, this effect disappears behind the stronger influence of depression on self-care. The consideration of mental comorbidities in patients with CHF is important.
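The "explanation of variance was augmented" comparison above is an incremental-R² analysis: fit a baseline OLS model, add the predictor of interest, and compare R² values. A sketch on simulated data (all variables and effect sizes below are invented, not taken from the study):

```python
import numpy as np

def r_squared(X, y):
    """R^2 of an OLS fit with intercept."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    return 1.0 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))

rng = np.random.default_rng(0)
n = 300
age = rng.normal(60, 10, n)
anxiety = rng.normal(8, 4, n)
# Hypothetical self-care score: driven mostly by age, weakly and
# negatively by anxiety, plus noise.
selfcare = 0.3 * age - 0.4 * anxiety + rng.normal(0, 5, n)

r2_base = r_squared(age[:, None], selfcare)                       # baseline model
r2_full = r_squared(np.column_stack([age, anxiety]), selfcare)    # + anxiety
delta_r2 = r2_full - r2_base                                      # incremental R^2
```

In-sample R² can never decrease when a predictor is added, so the interesting question (as in the abstract) is whether the increment survives once stronger predictors such as depression enter the model.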
DEFF Research Database (Denmark)
Rombouts, Jeroen V.K.; Stentoft, Lars; Violante, Francesco
in their specification of the conditional variance, conditional correlation, and innovation distribution. All models belong to the dynamic conditional correlation class, which is particularly suited because it allows one to consistently estimate the risk-neutral dynamics with a manageable computational effort in relatively...... innovation for a Laplace innovation assumption improves the pricing in a smaller way. Apart from investigating directly the value of model sophistication in terms of dollar losses, we also use the model confidence set approach to statistically infer the set of models that delivers the best pricing performance.
Development of Control Models and a Robust Multivariable Controller for Surface Shape Control
Energy Technology Data Exchange (ETDEWEB)
Winters, Scott Eric [Univ. of California, Davis, CA (United States)
2003-06-18
Surface shape control techniques are applied in many diverse disciplines, such as adaptive optics, noise control, aircraft flutter control and satellites, with the objective of achieving a desirable shape for an elastic body through the application of distributed control forces. Achieving the desirable shape is influenced by many factors, such as actuator locations, sensor locations, surface precision and controller performance. Building prototypes to complete design optimizations or controller development can be costly or impractical. This shortfall puts significant value in developing accurate modeling and control simulation approaches. This thesis focuses on the field of adaptive optics, although these developments have the potential for application in many other fields. A static finite element model is developed and validated using a large-aperture interferometer system. This model is then integrated into a control model using a linear least squares algorithm and a Shack-Hartmann sensor. The model is successfully exercised, showing functionality for various wavefront aberrations. Utilizing a verified model shows significant value in simulating static surface shape control problems with quantifiable uncertainties. A new dynamic model for a seven-actuator deformable mirror is presented and its accuracy is proven through experiment. Bond graph techniques are used to generate the state space model of the multi-actuator deformable mirror, including piezo-electric actuator dynamics. Using this verified model, a robust multi-input multi-output (MIMO) H_{∞} controller is designed and implemented. This controller demonstrated superior performance compared to a standard proportional-integral (PI) controller design.
Anacleto, Osvaldo; Queen, Catriona; Albers, Casper J.
2013-01-01
Traffic flow data are routinely collected for many networks worldwide. These invariably large data sets can be used as part of a traffic management system, for which good traffic flow forecasting models are crucial. The linear multiregression dynamic model (LMDM) has been shown to be promising for forecasting traffic flows.
Directory of Open Access Journals (Sweden)
Zeren Fatma
2010-01-01
Full Text Available This paper examines the long-run relationships between aggregate consumer prices and some cost-based components for the Turkish economy. Based on a simple economic model of macro-scale price formation, multivariate cointegration techniques have been applied to test whether the real data support the a priori model construction. The results reveal that all of the factors related to price determination have a positive impact on consumer prices, as expected. We find that the most significant component contributing to price setting is nominal exchange rate depreciation. We also cannot reject linear homogeneity of the sum of all the price data with respect to domestic inflation. The paper concludes that Turkish consumer prices in fact have a strong cost-push component that contributes to aggregate pricing.
Energy Technology Data Exchange (ETDEWEB)
Kamimura, R.; Bicciato, S.; Shimizu, H.; Alford, J.; Stephanopoulos, G.
2001-05-10
A framework is presented that emphasizes the need to understand the strengths and weaknesses of the data prior to modeling. In short, given a list of constraints, the idea is to let the data sort itself along those guidelines. Once the data has been organized in some coherent fashion, the user has a better understanding of the strengths and weaknesses of the data as the analysis proceeds. The goal is to understand the character of the data so that the user is not overwhelmed but is able to systematically organize and decompose information so as to facilitate the analysis and build an effective model. The data analyzed are from an industrial fermentation, but the framework presented is generic enough that it can be used in any application involving multivariate time series data, such as time-varying microarray measurements.
Sexton, David M. H.; Murphy, James M.
2012-06-01
A method for providing probabilistic climate projections, which applies a Bayesian framework to information from a perturbed physics ensemble, a multimodel ensemble and observations, was demonstrated in an accompanying paper. This information allows us to account for the combined effects of more sources of uncertainty than in any previous study of the equilibrium response to doubled CO2 concentrations, namely parametric and structural modelling uncertainty, internal variability, and observational uncertainty. Such probabilistic projections are dependent on the climate models and observations used but also contain an element of expert judgement. Two expert choices in the methodology involve the amount of information used to (a) specify the effects of structural modelling uncertainty and (b) represent the observational metrics that constrain the probabilistic climate projections. These choices, effected by selecting how many multivariate eigenvectors of a large set of climate variables to retain in our analysis, are investigated in more detail. We also show sensitivity tests that explore a range of key expert choices. For changes in annual global mean temperature and regional changes over England and Wales and Northern Europe, the variations in the projections across the sensitivity studies are small compared to the overall uncertainty, demonstrating that the projections are robust to reasonable variations in key assumptions. We are therefore confident that, despite sampling sources of uncertainty more comprehensively than previously, the improved multivariate treatment of observational metrics has narrowed the probability distribution of climate sensitivity consistent with evidence currently available. Our 5th, 50th, and 95th percentiles are in the range 2.2-2.4, 3.2-3.3, and 4.1-4.5K, respectively. The main caveat is that the handling of structural uncertainty does not account for systematic errors common to the current set of climate models and finding methods to
Multivariate Statistical Models for Predicting Sediment Yields from Southern California Watersheds
Gartner, Joseph E.; Cannon, Susan H.; Helsel, Dennis R.; Bandurraga, Mark
2009-01-01
Debris-retention basins in Southern California are frequently used to protect communities and infrastructure from the hazards of flooding and debris flow. Empirical models that predict sediment yields are used to determine the size of the basins. Such models have been developed using analyses of records of the amount of material removed from debris retention basins, associated rainfall amounts, measures of watershed characteristics, and wildfire extent and history. In this study we used multiple linear regression methods to develop two updated empirical models to predict sediment yields for watersheds located in Southern California. The models are based on both new and existing measures of volume of sediment removed from debris retention basins, measures of watershed morphology, and characterization of burn severity distributions for watersheds located in Ventura, Los Angeles, and San Bernardino Counties. The first model presented reflects conditions in watersheds located throughout the Transverse Ranges of Southern California and is based on volumes of sediment measured following single storm events with known rainfall conditions. The second model presented is specific to conditions in Ventura County watersheds and was developed using volumes of sediment measured following multiple storm events. To relate sediment volumes to triggering storm rainfall, a rainfall threshold was developed to identify storms likely to have caused sediment deposition. A measured volume of sediment deposited by numerous storms was parsed among the threshold-exceeding storms based on relative storm rainfall totals. The predictive strength of the two models developed here, and of previously-published models, was evaluated using a test dataset consisting of 65 volumes of sediment yields measured in Southern California. The evaluation indicated that the model developed using information from single storm events in the Transverse Ranges best predicted sediment yields for watersheds in San
Recursive Parameter Method for Computing the Predicting Function of the Multivariable ARMAX Model
Institute of Scientific and Technical Information of China (English)
[Anonymous]
2000-01-01
A new method for computing the predicting function of the ARMAX model is proposed. The proposed method constructs a set of schemes for recursively computing the parameters in the predicting function of the ARMAX model. In contrast to the existing method, which only gives results for the special case of the ARX model, the method presented is suitable not only for an SISO system but also for an MIMO system. For the SISO system, the method presented here is even more convenient than the existing ones.
Zhang, Xingwu; Gao, Robert X.; Yan, Ruqiang; Chen, Xuefeng; Sun, Chuang; Yang, Zhibo
2016-08-01
Crack formation is one of the crucial causes of structural failure. A methodology for quantitative crack identification is proposed in this paper based on the multivariable wavelet finite element method and particle swarm optimization. First, the structure with a crack is modeled by the multivariable wavelet finite element method (MWFEM) so that the first three natural frequencies in arbitrary crack conditions can be obtained; this is named the forward problem. Second, the structure with the crack is tested to obtain the first three natural frequencies by modal testing and advanced vibration signal processing. Then, the analyzed and measured first three natural frequencies are combined to obtain the location and size of the crack by using particle swarm optimization. Compared with the traditional wavelet finite element method, the MWFEM can achieve more accurate vibration analysis results because it interpolates all the solution variables at one time, which improves the accuracy of the MWFEM-based quantitative crack identification. Finally, the validity and superiority of the proposed method are verified by experiments on both a cantilever beam and a simply supported beam.
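The inverse step described above (search for the crack location and size whose predicted frequencies match the measured ones) can be sketched with a basic particle swarm optimizer. The forward model below is an invented smooth surrogate, not the MWFEM; only the optimization pattern is illustrated:

```python
import numpy as np

def freqs(loc, size):
    """Hypothetical surrogate for the forward problem: first three natural
    frequencies of a cracked beam as a smooth function of crack location
    (0..1 along the beam) and relative depth (0..0.5)."""
    base = np.array([50.0, 180.0, 420.0])
    drop = size * np.array([np.sin(np.pi * loc),
                            np.sin(2 * np.pi * loc) ** 2,
                            np.sin(3 * np.pi * loc) ** 2])
    return base * (1.0 - 0.3 * drop)

def pso_identify(target, n_particles=40, iters=200, seed=1):
    """Basic particle swarm search over (location, size)."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array([0.0, 0.0]), np.array([1.0, 0.5])
    x = rng.uniform(lo, hi, (n_particles, 2))
    v = np.zeros_like(x)
    cost = lambda p: np.sum((freqs(*p) - target) ** 2)
    pbest, pcost = x.copy(), np.array([cost(p) for p in x])
    g = pbest[pcost.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, 1))
        v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        c = np.array([cost(p) for p in x])
        better = c < pcost
        pbest[better], pcost[better] = x[better], c[better]
        g = pbest[pcost.argmin()].copy()
    return g

true_loc, true_size = 0.3, 0.2
estimate = pso_identify(freqs(true_loc, true_size))
```

Note the mirror symmetry of the surrogate: a crack at location 0.7 produces the same three frequencies as one at 0.3, which is the kind of ambiguity that additional modes or mode shapes resolve in practice.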
Directory of Open Access Journals (Sweden)
Carlo Baldassi
Full Text Available In the course of evolution, proteins show a remarkable conservation of their three-dimensional structure and their biological function, leading to strong evolutionary constraints on the sequence variability between homologous proteins. Our method aims at extracting such constraints from rapidly accumulating sequence data, and thereby at inferring protein structure and function from sequence information alone. Recently, global statistical inference methods (e.g. direct-coupling analysis, sparse inverse covariance estimation) have achieved a breakthrough towards this aim, and their predictions have been successfully implemented into tertiary and quaternary protein structure prediction methods. However, due to the discrete nature of the underlying variables (amino acids), exact inference requires exponential time in the protein length, and efficient approximations are needed for practical applicability. Here we propose a very efficient multivariate Gaussian modeling approach as a variant of direct-coupling analysis: the discrete amino-acid variables are replaced by continuous Gaussian random variables. The resulting statistical inference problem is efficiently and exactly solvable. We show that the quality of inference is comparable or superior to the one achieved by mean-field approximations to inference with discrete variables, as done by direct-coupling analysis. This is true for (i) the prediction of residue-residue contacts in proteins, and (ii) the identification of protein-protein interaction partners in bacterial signal transduction. An implementation of our multivariate Gaussian approach is available at the website http://areeweb.polito.it/ricerca/cmp/code.
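The key property the Gaussian variant exploits is that, in a multivariate Gaussian, direct couplings are off-diagonal entries of the precision (inverse covariance) matrix, whereas the covariance itself mixes direct and indirect correlations. A toy sketch with continuous stand-in variables (six invented "positions" with a known coupling pattern, not real alignment data):

```python
import numpy as np

rng = np.random.default_rng(2)

# Six "positions": 0-1 and 3-4 are directly coupled; 2 couples to 1 only,
# so 0 and 2 are correlated but NOT directly coupled; 5 is independent.
n_seq, n_pos = 2000, 6
X = rng.normal(size=(n_seq, n_pos))
X[:, 1] += 0.9 * X[:, 0]          # direct coupling 0 <-> 1
X[:, 4] += 0.9 * X[:, 3]          # direct coupling 3 <-> 4
X[:, 2] += 0.5 * X[:, 1]          # chain: 2 sees 0 only through 1

C = np.cov(X, rowvar=False)
# Invert the covariance: direct couplings live in the precision matrix.
J = np.linalg.inv(C)
partial = -J / np.sqrt(np.outer(np.diag(J), np.diag(J)))  # partial correlations
np.fill_diagonal(partial, 0.0)
score = np.abs(partial)           # coupling scores, analogous to contact scores
```

The score matrix ranks the truly coupled pairs (0,1) and (3,4) high while the indirect pair (0,2) scores near zero, which is exactly the disentangling of direct from indirect effects that direct-coupling analysis performs on real alignments.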
Chang, Jui-Yang; Pigorini, Andrea; Massimini, Marcello; Tononi, Giulio; Nobili, Lino; Van Veen, Barry D
2012-01-01
A multivariate autoregressive (MVAR) model with exogenous inputs (MVARX) is developed for describing the cortical interactions excited by direct electrical current stimulation of the cortex. Current stimulation is challenging to model because it excites neurons in multiple locations both near and distant to the stimulation site. The approach presented here models these effects using an exogenous input that is passed through a bank of filters, one for each channel. The filtered input and a random input excite a MVAR system describing the interactions between cortical activity at the recording sites. The exogenous input filter coefficients, the autoregressive coefficients, and random input characteristics are estimated from the measured activity due to current stimulation. The effectiveness of the approach is demonstrated using intracranial recordings from three surgical epilepsy patients. We evaluate models for wakefulness and NREM sleep in these patients, with two stimulation levels in one patient and two stimulation sites in another, resulting in a total of 10 datasets. Excellent agreement between measured and model-predicted evoked responses is obtained across all datasets. Furthermore, one-step prediction is used to show that the model also describes dynamics in pre-stimulus and evoked recordings. We also compare integrated information (a measure of intracortical communication thought to reflect the capacity for consciousness) associated with the network model in wakefulness and sleep. As predicted, higher information integration is found in wakefulness than in sleep for all five cases.
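The autoregressive part of such a model can be estimated by ordinary least squares: regress the multichannel signal at time t on its own past. A minimal sketch for a 2-channel VAR(1) on simulated data (the exogenous-input filter bank of the full MVARX model is omitted here, and the coefficient matrix is invented):

```python
import numpy as np

rng = np.random.default_rng(3)

# Simulate a stable 2-channel VAR(1): x_t = A x_{t-1} + noise.
A_true = np.array([[0.5, 0.2],
                   [0.0, 0.4]])
T = 5000
x = np.zeros((T, 2))
for t in range(1, T):
    x[t] = A_true @ x[t - 1] + rng.normal(0, 0.1, 2)

# Least-squares MVAR estimate: regress x_t on x_{t-1}.
Y, Z = x[1:], x[:-1]                                  # targets and lagged inputs
A_hat = np.linalg.lstsq(Z, Y, rcond=None)[0].T        # recovers A_true
```

With a longer lag order p, the same regression is run on stacked lags [x_{t-1}, ..., x_{t-p}]; the estimated coefficients then support one-step prediction exactly as used in the abstract to validate the model.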
Content-adaptive pentary steganography using the multivariate generalized Gaussian cover model
Sedighi, Vahid; Fridrich, Jessica; Cogranne, Rémi
2015-03-01
The vast majority of steganographic schemes for digital images stored in the raster format limit the amplitude of embedding changes to the smallest possible value. In this paper, we investigate the possibility to further improve the empirical security by allowing the embedding changes in highly textured areas to have a larger amplitude and thus embedding there a larger payload. Our approach is entirely model driven in the sense that the probabilities with which the cover pixels should be changed by a certain amount are derived from the cover model to minimize the power of an optimal statistical test. The embedding consists of two steps. First, the sender estimates the cover model parameters, the pixel variances, when modeling the pixels as a sequence of independent but not identically distributed generalized Gaussian random variables. Then, the embedding change probabilities for changing each pixel by 1 or 2, which can be transformed to costs for practical embedding using syndrome-trellis codes, are computed by solving a pair of non-linear algebraic equations. Using rich models and selection-channel-aware features, we compare the security of our scheme based on the generalized Gaussian model with pentary versions of two popular embedding algorithms: HILL and S-UNIWARD.
Directory of Open Access Journals (Sweden)
Stephen W Hartley
2012-09-01
Full Text Available Genome-wide association studies (GWAS) have identified numerous associations between genetic loci and individual phenotypes; however, relatively few GWAS have attempted to detect pleiotropic associations, in which loci are simultaneously associated with multiple distinct phenotypes. We show that pleiotropic associations can be directly modeled via the construction of simple Bayesian networks, and that these models can be applied to produce single or ensembles of Bayesian classifiers that leverage pleiotropy to improve genetic risk prediction. The proposed method includes two phases: (1) Bayesian model comparison, to identify SNPs associated with one or more traits; and (2) cross-validation feature selection, in which a final set of SNPs is selected to optimize prediction. To demonstrate the capabilities and limitations of the method, a total of 1600 case-control GWAS datasets with 2 dichotomous phenotypes were simulated under 16 scenarios, varying the association strengths of causal SNPs, the size of the discovery sets, the balance between cases and controls, and the number of pleiotropic causal SNPs. Across the 16 scenarios, prediction accuracy varied from 90% to 50%. In the 14 scenarios that included pleiotropically-associated SNPs, the pleiotropic model search and prediction methods consistently outperformed the naive model search and prediction. In the 2 scenarios in which there were no true pleiotropic SNPs, the differences between the pleiotropic and naive model searches were minimal.
A Model Technology Educator: Thomas A. Edison
Pretzer, William S.; Rogers, George E.; Bush, Jeffery
2007-01-01
Reflecting back over a century ago to the small village of Menlo Park, New Jersey provides insight into a remarkable visionary and an exceptional role model for today's problem-solving and design-focused technology educator: Thomas A. Edison, inventor, innovator, and model technology educator. Since Edison could not simply apply existing knowledge…
Multi-variable mathematical models for the air-cathode microbial fuel cell system
Ou, Shiqi; Kashima, Hiroyuki; Aaron, Douglas S.; Regan, John M.; Mench, Matthew M.
2016-05-01
This research adopted a version-control approach to model construction for the single-chamber air-cathode microbial fuel cell (MFC) system, to understand the interrelation of biological, chemical, and electrochemical reactions. The anodic steady-state model was used to consider the influence of chemical species diffusion and electric migration on MFC performance. In the cathodic steady-state model, the mass transport and reactions in a multi-layer abiotic cathode and a multi-bacteria cathode biofilm were simulated. Transport of hydroxide was assumed for the cathodic pH change. This assumption is an alternative to the typical notion of proton consumption during oxygen reduction to explain elevated cathode pH. The cathodic steady-state model provided power density and polarization curve results that can be compared to an experimental MFC system. Another aspect considered was the relative contributions of the platinum catalyst and microbes on the cathode to the oxygen reduction reaction (ORR). Simulation results showed that the biocatalyst in a cathode that includes a Pt/C catalyst likely plays a minor role in ORR, contributing up to 8% of the total power calculated by the models.
Multivariate sensitivity analysis to measure global contribution of input factors in dynamic models
Energy Technology Data Exchange (ETDEWEB)
Lamboni, Matieyendou [INRA, Unite MIA (UR341), F78352 Jouy en Josas Cedex (France); Monod, Herve, E-mail: herve.monod@jouy.inra.f [INRA, Unite MIA (UR341), F78352 Jouy en Josas Cedex (France); Makowski, David [INRA, UMR Agronomie INRA/AgroParisTech (UMR 211), BP 01, F78850 Thiverval-Grignon (France)
2011-04-15
Many dynamic models are used for risk assessment and decision support in ecology and crop science. Such models generate time-dependent model predictions, with time either discretised or continuous. Their global sensitivity analysis is usually applied separately on each time output, but Campbell et al. (2006) advocated global sensitivity analyses on the expansion of the dynamics in a well-chosen functional basis. This paper focuses on the particular case when principal components analysis is combined with analysis of variance. In addition to the indices associated with the principal components, generalised sensitivity indices are proposed to synthesize the influence of each parameter on the whole time series output. Index definitions are given when the uncertainty on the input factors is either discrete or continuous and when the dynamic model is either discrete or functional. A general estimation algorithm is proposed, based on classical methods of global sensitivity analysis. The method is applied to a dynamic wheat crop model with 13 uncertain parameters. Three methods of global sensitivity analysis are compared: the Sobol'-Saltelli method, the extended FAST method, and the fractional factorial design of resolution 6.
Applied multivariate statistical analysis
Härdle, Wolfgang Karl
2015-01-01
Focusing on high-dimensional applications, this 4th edition presents the tools and concepts used in multivariate data analysis in a style that is also accessible for non-mathematicians and practitioners. It surveys the basic principles and emphasizes both exploratory and inferential statistics; a new chapter on Variable Selection (Lasso, SCAD and Elastic Net) has also been added. All chapters include practical exercises that highlight applications in different multivariate data analysis fields: in quantitative financial studies, where the joint dynamics of assets are observed; in medicine, where recorded observations of subjects in different locations form the basis for reliable diagnoses and medication; and in quantitative marketing, where consumers’ preferences are collected in order to construct models of consumer behavior. All of these examples involve high to ultra-high dimensions and represent a number of major fields in big data analysis. The fourth edition of this book on Applied Multivariate ...
Bonne, François; Alamir, Mazen; Bonnay, Patrick; Bradu, Benjamin
2014-01-01
In this paper, a multivariable model-based non-linear controller for Warm Compression Stations (WCS) is proposed. The strategy is to replace all the PID loops controlling the WCS with an optimally designed model-based multivariable loop. This new strategy leads to high stability and fast rejection of disturbances such as those induced by a turbine or compressor stop, a key aspect in the case of large-scale cryogenic refrigeration. The proposed control scheme can be used to control every pressure precisely in normal operation, or to stabilize and control the cryoplant under high variation of thermal loads (such as the pulsed heat loads expected in future fusion reactors, e.g. in the cryogenic cooling systems of the International Thermonuclear Experimental Reactor (ITER) or the Japan Torus-60 Super Advanced fusion experiment (JT-60SA)). The paper details how to set up the WCS model, how to synthesize the Linear Quadratic Optimal feedback gain, and how to use it. After preliminary tuning at CEA-Grenoble on the 400W@1.8K helium test facility, the controller was implemented on a Schneider PLC and fully tested, first on CERN's real-time simulator. It was then experimentally validated on a real CERN cryoplant. The efficiency of the solution is experimentally assessed using a reasonable operating scenario of starts and stops of compressors and cryogenic turbines. This work is partially supported through the European Fusion Development Agreement (EFDA) Goal Oriented Training Program, task agreement WP10-GOT-GIRO.
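The Linear Quadratic Optimal feedback gain mentioned above solves a discrete-time Riccati equation. A minimal stand-in sketch, using an invented 2-state, 1-input linearized plant rather than the actual WCS model:

```python
import numpy as np

def dlqr(A, B, Q, R, iters=500):
    """Discrete-time LQR gain K (u = -K x) via fixed-point iteration of the
    Riccati equation; converges for stabilizable (A, B) with Q > 0."""
    P = Q.copy()
    for _ in range(iters):
        K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)
        P = Q + A.T @ P @ (A - B @ K)
    return K

# Hypothetical linearized plant (illustrative numbers only).
A = np.array([[1.0, 0.1],
              [0.0, 0.95]])
B = np.array([[0.0],
              [0.1]])
Q = np.eye(2)      # state penalty
R = np.array([[1.0]])  # input penalty
K = dlqr(A, B, Q, R)
closed = A - B @ K  # closed-loop dynamics, spectral radius < 1
```

The Q/R weights encode the trade-off between regulation accuracy (e.g. pressure deviations) and actuator effort; production implementations typically call a dedicated Riccati solver rather than iterating by hand.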
Djimadoumngar, K. N.; Lee, J.; Bila, M. D.; Djoret, D.; Ichoku, C. M.
2016-12-01
Food security and water shortage from frequent droughts have been major issues in northern sub-Saharan Africa (NSSA). The shrinking Lake Chad is one example experiencing severe droughts and insecure food production. One of the major challenges in the NSSA is the lack of data collection and monitoring systems to support decision-making in agriculture and water resources management. The present study aims to help better understand the hydrologic system of Lake Chad using multivariate regression models and to enhance the models to forecast the river discharge along the Chari-Logone river system, which contributes over 90% of the water inflow into the lake. As regressands, river discharge data from two monitoring stations, at Bongor and Logone-Gana, were collected for 2001-2007. The regressors include precipitation, soil moisture, soil and air temperature, specific humidity, evapotranspiration and surface runoff. The Tropical Rainfall Measuring Mission (TRMM) data were used for precipitation, and all other regressor parameters were obtained from the Global Land Data Assimilation System (GLDAS). We performed cross-correlation analysis between the river discharge and each regressor parameter to quantify the time lag giving the best correlation, which represents the response time of the river discharge to changes in the other hydrological parameters. The estimated time lags were integrated into the multivariate regression model. The results show that precipitation, soil moisture, and surface runoff have linear relationships with river discharge, while evapotranspiration, soil and air temperature, and specific humidity have non-linear relationships. The observed river discharge and the predicted discharge, modeled as a function of precipitation and soil moisture, show a good match, with a correlation of 93%.
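The lag-selection step can be sketched directly: slide one series against the other, pick the lag maximizing the correlation, then fit the lag-aligned regression. The data below are simulated (a synthetic rainfall series driving discharge with a known 7-step delay), and only a single regressor is used as a stand-in for the multivariate model:

```python
import numpy as np

def best_lag(x, y, max_lag=30):
    """Lag (in samples) at which x, shifted forward, best correlates with y."""
    corr = [np.corrcoef(x[:len(x) - L], y[L:])[0, 1] for L in range(max_lag + 1)]
    return int(np.argmax(corr))

rng = np.random.default_rng(4)
n = 500
rain = rng.gamma(2.0, 2.0, n)
# Hypothetical discharge responding to rainfall with a 7-step delay plus noise.
discharge = np.zeros(n)
discharge[7:] = 3.0 * rain[:-7] + rng.normal(0, 1.0, n - 7)

lag = best_lag(rain, discharge)            # recovers the 7-step delay
# Lag-aligned regression of discharge on rainfall (intercept + slope).
X = np.column_stack([np.ones(n - lag), rain[:n - lag]])
beta = np.linalg.lstsq(X, discharge[lag:], rcond=None)[0]
```

With several regressors, each one is shifted by its own estimated lag before being stacked into the design matrix, which is the structure the abstract describes.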
Multivariate power-law models for streamflow prediction in the Mekong Basin
Directory of Open Access Journals (Sweden)
Guillaume Lacombe
2014-11-01
New hydrological insights for the region: A combination of 3–6 explanatory variables – chosen among annual rainfall, drainage area, perimeter, elevation, slope, drainage density and latitude – is sufficient to predict a range of flow metrics with a prediction R-squared ranging from 84 to 95%. The inclusion of forest or paddy percentage coverage as an additional explanatory variable led to slight improvements in the predictive power of some of the low-flow models (lowest prediction R-squared = 89%). A physical interpretation of the model structure was possible for most of the resulting relationships. Compared to regional regression models developed in other parts of the world, this new set of equations performs reasonably well.
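A multivariate power-law relationship of this kind is typically fitted by ordinary least squares after a log transform, which turns the multiplicative model into a linear one. A minimal sketch on synthetic data (the exponents 0.9 and 1.2 and the value ranges are invented, not the paper's estimates):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 60
area = rng.uniform(100, 10000, n)     # drainage area, km^2 (synthetic)
rain = rng.uniform(1000, 2500, n)     # annual rainfall, mm (synthetic)
# Synthetic power-law flow metric: Q = 1e-4 * area^0.9 * rain^1.2 * noise
flow = 1e-4 * area**0.9 * rain**1.2 * np.exp(rng.normal(0, 0.1, n))

# log(Q) = log(a) + b*log(area) + c*log(rain) + error
X = np.column_stack([np.ones(n), np.log(area), np.log(rain)])
y = np.log(flow)
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
pred = X @ beta
r2 = 1 - np.sum((y - pred) ** 2) / np.sum((y - y.mean()) ** 2)
```

The fitted slopes in log space are the power-law exponents, which is what makes a physical interpretation of the model structure possible.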
Scrutiny of Appropriate Model Error Specification in Multivariate Assimilation Framework using mHM
Rakovec, O.; Noh, S. J.; Kumar, R.; Samaniego, L. E.
2015-12-01
Reliable and accurate prediction of regional-scale water fluxes and states remains a great challenge for the scientific community. Several sectors of society (municipalities, agriculture, energy, etc.) may benefit from successful solutions that appropriately quantify uncertainties in hydro-meteorological prediction systems, with particular attention to extreme weather conditions. Increased availability and quality of near real-time data enable better understanding of the predictive skill of forecasting frameworks. To address this issue, automatic model-observation integration is required for appropriate model initialization. In this study, the effects of noise specification on the quality of hydrological forecasts are scrutinized via a data assimilation system. This framework has been developed by incorporating the mesoscale hydrologic model (mHM, http://www.ufz.de/mhm) with a particle filtering (PF) approach used for model state updating. In comparison with previous works, a lag PF is considered to better account for the response times of internal hydrologic processes. The objective of this study is to assess the benefits of model state updating for prediction of water fluxes and states up to three months ahead using particle filtering. The efficiency of this system is demonstrated in 10 large European basins. We evaluate the model skill for five assimilation scenarios using observed (1) discharge (Q); (2) MODIS evapotranspiration (ET); (3) GRACE terrestrial total water storage (TWS) anomaly; (4) ESA-CCI soil moisture (SM); and (5) the combination of Q, ET, TWS, and SM in a hindcast experiment (2004-2010). The effects of error perturbations on both the analysis and the forecasts are presented, and optimal trade-offs are discussed. While large perturbations are preferred at the analysis time step, steep deterioration is observed for longer lead times, for which more conservative error measures should be considered. From all the datasets, complementary GRACE TWS data together
Shareef, Muntadher A.; Toumi, Abdelmalek; Khenchaf, Ali
2014-10-01
Remote sensing is one of the most important tools for monitoring, estimating and predicting Water Quality Parameters (WQPs). Traditional methods for monitoring pollutants generally rely on optical images. In this paper, we present a new approach based on Synthetic Aperture Radar (SAR) images, which we use to map the region of interest and to estimate the WQPs. To improve estimation quality, texture analysis is exploited to strengthen the regression models. These models are established and developed to estimate six water quality parameters of common concern from texture parameters extracted from TerraSAR-X data. For this purpose, the Gray Level Co-occurrence Matrix (GLCM) is used to build several regression models from six texture parameters: contrast, correlation, energy, homogeneity, entropy and variance. For each predicted model, an accuracy value is computed from the probability value given by the regression analysis of each parameter. To validate our approach, we used two datasets of the water region, one for training and one for testing. To evaluate and validate the proposed model, we applied it to the training set. In the last stage, we used fuzzy K-means clustering to generalize the water quality estimation to the whole water region extracted from the segmented TerraSAR-X image. The results showed a good statistical correlation between the in situ water quality and the TerraSAR-X data, and demonstrated that the characteristics obtained by texture analysis are able to monitor and predict the distribution of WQPs in large rivers with high accuracy.
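A minimal sketch of the GLCM texture-feature extraction step (pure NumPy, a single displacement vector; the image below is random stand-in data, not a SAR patch, and only four of the six listed features are computed):

```python
import numpy as np

def glcm(img, dx=1, dy=0, levels=8):
    """Gray-level co-occurrence matrix for one displacement (dx, dy), normalized."""
    h, w = img.shape
    P = np.zeros((levels, levels))
    for i in range(h - dy):
        for j in range(w - dx):
            P[img[i, j], img[i + dy, j + dx]] += 1
    return P / P.sum()

def texture_features(P):
    """Contrast, energy, homogeneity and entropy of a normalized GLCM."""
    levels = P.shape[0]
    i, j = np.indices((levels, levels))
    contrast = np.sum(P * (i - j) ** 2)
    energy = np.sum(P ** 2)
    homogeneity = np.sum(P / (1.0 + np.abs(i - j)))
    entropy = -np.sum(P[P > 0] * np.log(P[P > 0]))
    return np.array([contrast, energy, homogeneity, entropy])

rng = np.random.default_rng(2)
img = rng.integers(0, 8, (32, 32))   # stand-in for an 8-level quantized image patch
feats = texture_features(glcm(img))
```

Each feature vector would then serve as a regressor in the per-parameter regression models.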
DEFF Research Database (Denmark)
Baadsgaard, Mikkel; Nielsen, Jan Nygaard; Madsen, Henrik
2000-01-01
An econometric analysis of continuous-time models of the term structure of interest rates is presented. A panel of coupon bond prices with different maturities is used to estimate the embedded parameters of a continuous-discrete state space model of unobserved state variables: the spot interest rate......, the central tendency and stochastic volatility. Emphasis is placed on the particular class of exponential-affine term structure models that permits solving the bond pricing PDE in terms of a system of ODEs. It is assumed that coupon bond prices are contaminated by additive white noise, where the stochastic...
Research of Home Information Technology Adoption Model
Institute of Scientific and Technical Information of China (English)
Ao Shan; Ren Weiyin; Lin Peishan; Tang Shoulian
2008-01-01
Information technology at home has caught the attention of various industries such as IT, home appliances, communications, and real estate. Based on information technology acceptance theories and family consumption behavior theories, this study summarized and analyzed four key belief variables, i.e. Perceived Value, Perceived Risk, Perceived Cost and Perceived Ease of Use, which influence the acceptance of home information technology. The study also summarizes three groups of external variables. They are social, industrial, and family influence factors. The social influence factors include Subjective Norm; the industry factors include the Unification of Home Information Technological Standards, the Perfection of the Home Information Industry Value Chain, and the Competitiveness of the Home Information Industry; and the family factors include Family Income, Family Life Cycle and Family Educational Level. The study discusses the relationships among these external variables and the cognitive variables. Finally, the study proposes a Home Information Technology Acceptance Model based on the Technology Acceptance Model and the characteristics of home information technology consumption.
Kim, Sungduk; Chen, Ming-Hui; Ibrahim, Joseph G; Shah, Arvind K; Lin, Jianxin
2013-10-15
In this paper, we propose a class of Box-Cox transformation regression models with multidimensional random effects for analyzing multivariate responses for individual patient data in meta-analysis. Our modeling formulation uses a multivariate normal response meta-analysis model with multivariate random effects, in which each response is allowed to have its own Box-Cox transformation. Prior distributions are specified for the Box-Cox transformation parameters as well as the regression coefficients in this complex model, and the deviance information criterion is used to select the best transformation model. Because the model is quite complex, we develop a novel Markov chain Monte Carlo sampling scheme to sample from the joint posterior of the parameters. This model is motivated by a very rich dataset comprising 26 clinical trials involving cholesterol-lowering drugs, where the goal is to jointly model the three-dimensional response consisting of low density lipoprotein cholesterol (LDL-C), high density lipoprotein cholesterol (HDL-C), and triglycerides (TG). Because the joint distribution of (LDL-C, HDL-C, TG) is not multivariate normal and in fact quite skewed, a Box-Cox transformation is needed to achieve normality. In the clinical literature, these three variables are usually analyzed univariately; however, a multivariate approach is more appropriate because the variables are correlated with each other. We carry out a detailed analysis of these data by using the proposed methodology. Copyright © 2013 John Wiley & Sons, Ltd.
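The Box-Cox step can be illustrated with the standard profile-likelihood choice of the transformation parameter. This is a generic frequentist sketch on synthetic lognormal data, not the paper's Bayesian formulation (which places a prior on the parameter instead):

```python
import numpy as np

def boxcox(y, lam):
    """Box-Cox transform: (y**lam - 1)/lam, reducing to log(y) as lam -> 0."""
    y = np.asarray(y, dtype=float)
    return np.log(y) if abs(lam) < 1e-8 else (y**lam - 1.0) / lam

def boxcox_loglik(y, lam):
    """Profile log-likelihood of lambda under a normal model for z = boxcox(y)."""
    z = boxcox(y, lam)
    n = len(y)
    # -n/2 * log(sigma_hat^2) plus the Jacobian term (lam - 1) * sum(log y)
    return -0.5 * n * np.log(z.var()) + (lam - 1.0) * np.sum(np.log(y))

rng = np.random.default_rng(3)
y = np.exp(rng.normal(0.0, 0.5, 2000))   # lognormal data: lam = 0 is the right transform
lams = np.linspace(-1, 1, 41)
best = lams[np.argmax([boxcox_loglik(y, l) for l in lams])]
```

For lognormal data the profile likelihood peaks near zero, recovering the log transform.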
A multivariate multilevel approach to the modeling of accuracy and speed of test takers
Klein Entink, R.H.; Fox, J.P.; Linden, W.J. van der
2009-01-01
Response times on test items are easily collected in modern computerized testing. When collecting both (binary) responses and (continuous) response times on test items, it is possible to measure the accuracy and speed of test takers. To study the relationships between these two constructs, the model
Directory of Open Access Journals (Sweden)
Tao Gao
2014-01-01
Full Text Available Extreme precipitation is likely to be one of the most severe meteorological disasters in China; however, the physical factors affecting precipitation extremes and the corresponding prediction models are not yet well established. From a new point of view, the sensible heat flux (SHF) and latent heat flux (LHF), which have significant impacts on summer extreme rainfall in the Yangtze River basin (YRB), were quantified, and impact factors were then selected. First, a regional extreme precipitation index was applied to determine Regions of Significant Correlation (RSC) by analyzing the spatial distribution of correlation coefficients between this index and SHF, LHF, and sea surface temperature (SST) on the global ocean scale; the time series of SHF, LHF, and SST in the RSCs during 1967–2010 were then selected. Furthermore, other factors that significantly affect variations in precipitation extremes over the YRB were also selected. Multiple stepwise regression and leave-one-out cross-validation (LOOCV) were utilized to analyze and test the influencing factors and the statistical prediction model. The correlation coefficient between the observed regional extreme index and the model simulation is 0.85, significant at the 99% level. This suggests that the forecast skill is acceptable, although many aspects of the prediction model should be improved.
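Leave-one-out cross-validation of a linear regression does not require n refits: the hat-matrix identity gives every held-out residual from a single fit. A sketch with synthetic stand-ins for the predictors (names, coefficients and the irrelevance of the third predictor are invented for illustration):

```python
import numpy as np

def loocv_press(X, y):
    """LOOCV sum of squared prediction errors (PRESS) for OLS, using the
    shortcut e_i / (1 - h_ii) instead of refitting the model n times."""
    H = X @ np.linalg.solve(X.T @ X, X.T)          # hat matrix
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    resid = y - X @ beta
    loo_resid = resid / (1.0 - np.diag(H))         # exact held-out residuals
    return np.sum(loo_resid ** 2)

rng = np.random.default_rng(4)
n = 100
shf = rng.normal(size=n)     # stand-ins for SHF, LHF and SST predictors
lhf = rng.normal(size=n)
sst = rng.normal(size=n)
y = 1.0 * shf + 0.5 * lhf + rng.normal(0, 0.3, n)  # sst is irrelevant here

X_full = np.column_stack([np.ones(n), shf, lhf, sst])
X_red = np.column_stack([np.ones(n), shf, lhf])
press_full = loocv_press(X_full, y)
press_red = loocv_press(X_red, y)
```

Comparing PRESS across candidate predictor subsets is one way a stepwise procedure can be validated with LOOCV.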
Advanced Multivariate Inversion Techniques for High Resolution 3D Geophysical Modeling
2011-09-01
distributed across surrounding tectonic plates. Though the resulting continent-scale maps possess less detail than local-scale group velocity maps... requirements with high confidence, the Air Force Technical Applications Center needs new and improved capabilities for analyzing regional seismic... wave magnitude mbɜ) seismic events. For seismically active areas, inaccurate models can be corrected using the kriging methodology and, therefore
Directory of Open Access Journals (Sweden)
José Celaya-Padilla
2015-01-01
Full Text Available Mammography is the most common and effective breast cancer screening test. However, the rate of positive findings is very low, making radiologic interpretation monotonous and error-prone. This work presents a computer-aided diagnosis (CADx) method aimed at automatically triaging mammogram sets. The method coregisters the left and right mammograms, extracts image features, and classifies the subjects into those at risk of having malignant calcifications (CS), those at risk of having malignant masses (MS), and healthy subjects (HS). In this study, 449 subjects (197 CS, 207 MS, and 45 HS) from a public database were used to train and evaluate the CADx. Percentile-rank (p-rank) and z-normalizations were used. For the p-rank, the CS versus HS model achieved a cross-validation accuracy of 0.797 with an area under the receiver operating characteristic curve (AUC) of 0.882; the MS versus HS model obtained an accuracy of 0.772 and an AUC of 0.842. For the z-normalization, the CS versus HS model achieved an accuracy of 0.825 with an AUC of 0.882, and the MS versus HS model obtained an accuracy of 0.698 and an AUC of 0.807. The proposed method has the potential to rank cases with a high probability of malignant findings, aiding in the prioritization of radiologists' work lists.
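The AUC figures above can be computed with the Mann-Whitney formulation, and the p-rank normalization mentioned in the abstract is monotone, so it leaves the AUC unchanged. A sketch on synthetic scores (the score distributions are invented, not the CADx outputs):

```python
import numpy as np

def roc_auc(scores, labels):
    """AUC as the probability that a random positive case outscores a
    random negative one (Mann-Whitney formulation, ties counted 1/2)."""
    scores = np.asarray(scores, dtype=float)
    labels = np.asarray(labels, dtype=bool)
    pos, neg = scores[labels], scores[~labels]
    diff = pos[:, None] - neg[None, :]
    wins = (diff > 0).sum() + 0.5 * (diff == 0).sum()
    return wins / (len(pos) * len(neg))

rng = np.random.default_rng(5)
labels = np.r_[np.ones(50), np.zeros(50)].astype(bool)
scores = np.r_[rng.normal(1.5, 1, 50), rng.normal(0.0, 1, 50)]
auc = roc_auc(scores, labels)

# Percentile-rank normalization: map scores to their ranks scaled to [0, 1]
prank = scores.argsort().argsort() / (len(scores) - 1)
auc_prank = roc_auc(prank, labels)   # identical, since ranks preserve order
```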
Payne, Beth A.; Groen, Henk; Ukah, U. Vivian; Ansermino, J. Mark; Bhutta, Zulfiqar; Grobman, William; Hall, David R.; Hutcheon, Jennifer A.; Magee, Laura A.; von Dadelszen, Peter
2015-01-01
Objective: To develop and internally validate a prognostic model for perinatal death that could guide community-based antenatal care of women with a hypertensive disorder of pregnancy (HDP) in low-resourced settings as part of a mobile health application. Study design: Using data from 1688 women (11
Two Phase Analysis of Ski Schools Customer Satisfaction: Multivariate Ranking and Cub Models
Directory of Open Access Journals (Sweden)
Rosa Arboretti
2014-06-01
Full Text Available Monitoring tourists' opinions is an important issue also for companies providing sport services. The aim of this paper was to apply CUB models and nonparametric permutation methods to a large customer satisfaction survey performed in 2011 in the ski schools of Alto Adige (Italy). The two-phase data processing was mainly aimed at: establishing a global ranking of a sample of five ski schools on the basis of satisfaction scores for several specific service aspects; estimating specific components of the respondents' evaluation process (feeling and uncertainty); and detecting whether customers' characteristics affected these two components. With the application of NPC-Global ranking we obtained a ranking of the evaluated ski schools that simultaneously considers satisfaction scores for several service aspects. CUB models showed which aspects and subgroups were less satisfied, offering guidance on how to improve services and customer satisfaction.
Directory of Open Access Journals (Sweden)
Bryan R Conroy
Full Text Available Multivariate decoding models are increasingly being applied to functional magnetic resonance imaging (fMRI) data to interpret the distributed neural activity in the human brain. These models are typically formulated to optimize an objective function that maximizes decoding accuracy. For decoding models trained on full-brain data, this can result in multiple models that yield the same classification accuracy, though some may be more reproducible than others; i.e., small changes to the training set may result in very different voxels being selected. This issue of reproducibility can be partially controlled by regularizing the decoding model. Regularization, along with the cross-validation used to estimate decoding accuracy, typically requires retraining many (often on the order of thousands of) related decoding models. In this paper we describe an approach that uses a combination of bootstrapping and permutation testing to construct both a measure of cross-validated prediction accuracy and a measure of the reproducibility of the learned brain maps. This requires re-training our classification method on many re-sampled versions of the fMRI data. Given the size of fMRI datasets, this is normally a time-consuming process. Our approach leverages an algorithm called fast simultaneous training of generalized linear models (FaSTGLZ) to create a family of classifiers in the space of accuracy vs. reproducibility. The convex hull of this family of classifiers can be used to identify a subset of Pareto optimal classifiers, with a single optimal classifier selectable based on the relative cost of accuracy vs. reproducibility. We demonstrate our approach using full-brain analysis of elastic-net classifiers trained to discriminate stimulus type in an auditory and visual oddball event-related fMRI design. Our approach and results argue for a computational approach to fMRI decoding models in which the value of the interpretation of the decoding model ultimately depends upon optimizing a
Mfumu Kihumba, Antoine; Vanclooster, Marnik
2013-04-01
Drinking water in Kinshasa, the capital of the Democratic Republic of Congo, is provided by extracting groundwater from the local aquifer, particularly in peripheral areas. The exploited groundwater body is mainly unconfined and located within a continuous detrital aquifer, primarily composed of sedimentary formations. However, the aquifer is subjected to an increasing threat of anthropogenic pollution pressure. Understanding the detailed origin of this pollution pressure is important for sustainable drinking water management in Kinshasa. The present study aims to explain the observed nitrate pollution problem, nitrate being considered a good tracer for other pollution threats. The analysis is made in terms of readily available physical attributes, using a statistical modelling approach. For the nitrate data, use was made of a historical groundwater quality assessment study, for which the data were re-analysed. The physical attributes are related to the topography, land use, geology and hydrogeology of the region. Prior to the statistical modelling, intrinsic and specific vulnerability for nitrate pollution was assessed. This vulnerability assessment showed that the alluvium area in the northern part of the region is the most vulnerable. This area consists of urban land use with poor sanitation. Re-analysis of the nitrate pollution data demonstrated that the spatial variability of nitrate concentrations in the groundwater body is high, and coherent with the fragmented land use of the region and with the intrinsic and specific vulnerability maps. For the statistical modelling, use was made of multiple regression and regression tree analysis. The results demonstrated the significant impact of land use variables on the Kinshasa groundwater nitrate pollution and the need for a detailed delineation of groundwater capture zones around the monitoring stations. Key words: Groundwater, Isotopic, Kinshasa, Modelling, Pollution, Physico-chemical.
Energy Technology Data Exchange (ETDEWEB)
Bortolet, P.
1998-12-11
During the last two decades, growing awareness of the contribution of the automobile to the degradation of the environment has forced the various actors of the transportation world to place automobiles under increasingly severe controls. Fuzzy logic is a technique that allows expert knowledge to be taken into account; the most recent research has moreover shown the interest of associating fuzzy logic with algorithmic control techniques (adaptive control, robust control...). Our research work can be broken down into three distinct parts: a theoretical approach concerning fuzzy modeling methods that make it possible to obtain models of the Takagi-Sugeno type and to use them in control synthesis; the physical modeling of a four-stroke direct-injection gasoline engine, in collaboration with the development teams of Siemens Automotive SA; and the simulated application of the fuzzy modeling and fuzzy control techniques developed at the theoretical level to a four-stroke direct-injection gasoline engine. (author) 105 refs.
Schlemm, Eckhard; 10.3150/10-BEJ329
2012-01-01
The class of multivariate Lévy-driven autoregressive moving average (MCARMA) processes, the continuous-time analogs of the classical vector ARMA processes, is shown to be equivalent to the class of continuous-time state space models. The linear innovations of the weak ARMA process arising from sampling an MCARMA process at an equidistant grid are proved to be exponentially completely regular (β-mixing) under a mild continuity assumption on the driving Lévy process. It is verified that this continuity assumption is satisfied in most practically relevant situations, including the case where the driving Lévy process has a non-singular Gaussian component, is compound Poisson with an absolutely continuous jump size distribution, or has an infinite Lévy measure admitting a density around zero.
Directory of Open Access Journals (Sweden)
Manuel Sousa Gabrie
2014-09-01
Full Text Available This study analyzed the market risk of an international investment portfolio by means of a new methodological proposal based on Value-at-Risk (VaR), using the covariance matrix of multivariate GARCH-type models and extreme value theory, in order to determine whether an international diversification strategy minimizes market risk and whether the VaR methodology adequately captures market risk, by applying backtesting. To this end, we considered twelve international stock indexes, accounting for about 62% of the world stock market capitalization, and chose the period from the Dot-Com crisis to the current global financial crisis. Results show that the proposed methodology is a good alternative for accommodating high market turbulence and can be considered an adequate portfolio risk management instrument.
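Given a covariance matrix of index returns (in practice, a forecast from a multivariate GARCH model), the parametric portfolio VaR follows directly from the portfolio variance. A sketch with an illustrative 3-index covariance matrix (the numbers are invented, not estimated from the paper's data):

```python
import numpy as np

def parametric_var(weights, cov, alpha=0.99, horizon=1):
    """One-sided Gaussian Value-at-Risk of portfolio returns, expressed as a
    positive loss fraction, given a (possibly GARCH-forecast) covariance matrix."""
    z = {0.95: 1.645, 0.99: 2.326}[alpha]     # standard normal quantiles
    sigma = np.sqrt(weights @ cov @ weights)  # portfolio return std dev
    return z * sigma * np.sqrt(horizon)

# Illustrative daily-return covariance of three stock indexes
cov = np.array([[4.0, 1.2, 0.8],
                [1.2, 2.5, 0.6],
                [0.8, 0.6, 3.0]]) * 1e-4
w_single = np.array([1.0, 0.0, 0.0])   # concentrated in one index
w_divers = np.ones(3) / 3              # equally weighted across indexes
var_single = parametric_var(w_single, cov, 0.99)
var_divers = parametric_var(w_divers, cov, 0.99)
```

With imperfectly correlated indexes, the diversified portfolio's VaR is strictly smaller, which is the diversification effect the study tests.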
Choi, Eunhee; Tang, Fengyan; Kim, Sung-Geun; Turk, Phillip
2016-10-01
This study examined the longitudinal relationships between functional health in later years and three types of productive activities: volunteering, full-time work, and part-time work. Using data from five waves (2000-2008) of the Health and Retirement Study, we applied multivariate latent growth curve modeling to examine the longitudinal relationships among individuals aged 50 or over. Functional health was measured by limitations in activities of daily living. Individuals who volunteered or worked either full time or part time exhibited a slower decline in functional health than nonparticipants. Significant associations were also found between initial functional health and longitudinal changes in productive activity participation. This study provides additional support for the benefits of productive activities later in life; engagement in volunteering and employment is indeed associated with better functional health in middle and old age.
Zheng, Qiang; Li, Honglun; Fan, Baode; Wu, Shuanhu; Xu, Jindong
2017-09-01
The active contour model (ACM) has been one of the most widely utilized methods in magnetic resonance (MR) brain image segmentation because of its ability to capture topology changes. However, most existing ACMs consider only single-slice information in MR brain image data, i.e., the information used in ACM-based segmentation is extracted from only one slice of the MR brain image. This cannot take full advantage of the information in adjacent slice images and is insufficient for local segmentation of MR brain images. In this paper, a novel ACM is proposed to solve this problem; it is based on a multivariate local Gaussian distribution and combines information from adjacent slice images in the MR brain image data. The segmentation is finally achieved through maximizing the likelihood estimation. Experiments demonstrate the advantages of the proposed ACM over single-slice ACMs in local segmentation of MR brain image series.
Directory of Open Access Journals (Sweden)
S. Sparnocchia
Full Text Available Multivariate vertical Empirical Orthogonal Functions (EOFs) are calculated for the entire Mediterranean Sea, both from observations and from model simulations, in order to find the optimal number of vertical modes to represent the upper thermocline vertical structure. For the first time, we show that the large-scale Mediterranean thermohaline vertical structure can be represented by a limited number of vertical multivariate EOFs, and that the "optimal set" can be selected on the basis of general principles. In particular, the EOFs are calculated for the combined temperature and salinity statistics, dividing the Mediterranean Sea into 9 regions and grouping the data seasonally. The criterion used to establish whether a reduced set of EOFs is optimal is based on the analysis of the root mean square residual error between the original data and the profiles reconstructed by the reduced set of EOFs. It was found that the number of EOFs needed to capture the variability contained in the original data changes with geographical region and season. In particular, winter data require a smaller number of modes (4–8, depending on the region) than the other seasons (8–9 in summer). Moreover, western Mediterranean regions require more modes than eastern Mediterranean ones, but this result may depend on the data scarcity in the latter regions. The EOFs computed from the in situ data set are compared to those calculated using data obtained from a model simulation. The main results of this exercise are that the two groups of modes are not strictly comparable but their ability to reproduce observations is the same. Thus, they may be thought of as equivalent sets of basis functions upon which to project the thermohaline variability of the basin.
Key words. Oceanography: general (water masses) – Oceanography: physical (hydrography; instruments and techniques)
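The procedure above (compute multivariate EOFs of stacked, standardized temperature and salinity profiles, then track the RMS residual of a truncated reconstruction) can be sketched on synthetic profiles; the single sinusoidal vertical mode below is invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(6)
n_prof, n_depth = 300, 20
mode = np.sin(np.linspace(0.0, np.pi, n_depth))   # one dominant vertical mode
amps = rng.normal(0.0, 2.0, n_prof)               # its amplitude per profile
temp = np.outer(amps, mode) + rng.normal(0, 0.1, (n_prof, n_depth))
salt = np.outer(amps, mode**2) + rng.normal(0, 0.1, (n_prof, n_depth))

def standardize(x):
    """Zero-mean, unit-variance columns so T and S are comparable."""
    return (x - x.mean(axis=0)) / x.std(axis=0)

# Stack standardized T and S into one state vector per profile (multivariate EOF)
X = np.hstack([standardize(temp), standardize(salt)])
U, s, Vt = np.linalg.svd(X, full_matrices=False)
expl = s**2 / np.sum(s**2)            # variance fraction captured by each EOF

# RMS residual of a reconstruction truncated to k modes, the selection criterion
k = 2
Xk = (U[:, :k] * s[:k]) @ Vt[:k]
rms_resid = np.sqrt(np.mean((X - Xk) ** 2))
```

Repeating the residual computation over increasing k, per region and season, yields the "optimal set" sizes reported in the abstract.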
A multi-variable box model approach to the soft tissue carbon pump
Directory of Open Access Journals (Sweden)
A. M. de Boer
2010-12-01
Full Text Available The canonical question of which physical, chemical or biological mechanisms were responsible for oceanic uptake of atmospheric CO₂ during the last glacial remains unanswered. Insight from paleo-proxies has led to a multitude of hypotheses, but none so far has been convincingly supported by three-dimensional numerical modelling experiments. The processes that influence CO₂ uptake and export production are inter-related and too complex to solve conceptually, while complex numerical models are time-consuming and expensive to run, which severely limits the combinations of mechanisms that can be explored. Instead, an intermediate inverse box model approach to the soft tissue pump is used here, in which the whole parameter space is explored. The glacial circulation and biological production states are derived from these using proxies of glacial export production and the need to draw down CO₂ into the ocean. We find that circulation patterns which explain glacial observations include reduced Antarctic Bottom Water formation and high-latitude upwelling and mixing of deep water, and to a lesser extent reduced equatorial upwelling. The proposed mechanism of CO₂ uptake by an increase of eddies in the Southern Ocean, leading to a reduced residual circulation, is not supported. Regarding biological mechanisms, an increase in nutrient utilization in either the equatorial regions or the northern polar latitudes can reduce atmospheric CO₂ and satisfy proxies of glacial export production. Consistent with previous studies, CO₂ is drawn down more easily through increased productivity in the Antarctic region than in the sub-Antarctic, but this violates observations of lower export production there. The glacial states are more sensitive to changes in circulation and less sensitive to changes in nutrient utilization rates than the interglacial states.
Multivariate explanatory model for sporadic carcinoma of the colon in Dukes' stages I and IIa
Directory of Open Access Journals (Sweden)
J.M. Villadiego-Sánchez, M. Ortega-Calvo, R. Pino-Mejías, A. Cayuela, P. Iglesias-Bonilla, F. García-de la Corte, J.M. Santos-Lozano, José Lapetra-Peralta
2009-01-01
Full Text Available Objective: We previously obtained an explanatory model with six explanatory variables: age of the patient (AGE), total cholesterol (TC), HDL cholesterol (HDL-C), VLDL cholesterol (VLDL-C), alkaline phosphatase (AP) and the CA 19.9 tumour marker. Our objective in this study was to validate the model by acquiring new records for an additional analysis. Design: Non-paired case-control study. Setting: Urban and rural hospitals and primary health facilities in Western Andalusia and Extremadura (Spain). Patients: At both the primary care and hospital levels, controls were gathered in a prospective manner (n=275). Cases were collected in both a prospective and a retrospective manner (n=126). Main outcome measures: Descriptive statistics, logistic regression and bootstrap analysis. Results: AGE (odds ratio 1.02; 95% CI 1.003-1.037; p=0.01), TC (odds ratio 0.986; 95% CI 0.980-0.992; p<0.001) and CA 19.9 (odds ratio 1.023; 95% CI 1.012-1.034; p<0.001) were the variables that showed significant values in the logistic regression and bootstrap analyses. Berkson's bias was statistically assessed. Conclusions: The model, validated by means of logistic regression and bootstrap analysis, contains the variables AGE, TC, and CA 19.9 (three of the original six) and reaches level 4 of 5 according to the criteria of Justice et al. (multiple independent validations) [Ann. Intern. Med. 1999; 130: 515].
Directory of Open Access Journals (Sweden)
D. P. Siu
2011-01-01
Full Text Available In this work, a class of multidimensional stochastic hybrid dynamic models is studied. The system under investigation is a first-order linear nonhomogeneous system of Itô-Doob type stochastic differential equations with switching coefficients. The switching of the system is governed by a discrete dynamic which is monitored by a non-homogeneous Poisson process. Closed-form solutions of the systems are obtained. Furthermore, the major part of the work is devoted to finding closed-form probability density functions of the solution processes of linear homogeneous and Ornstein-Uhlenbeck type systems with jumps.
A Model-Free Method for Structural Change Detection in Multivariate Nonlinear Time Series
Institute of Scientific and Technical Information of China (English)
孙青华; 张世英; 梁雄健
2003-01-01
In this paper, we apply the recursive genetic programming (RGP) approach to the cognition of a system and then proceed to a detection procedure for structural changes in systems whose components exhibit long memory. This approach is adaptive and model-free, and can simulate the individual activities of the system's participants; it therefore has a strong ability to recognize the operating mechanism of the system. Based on the prior cognition of the system, a test statistic is developed for the detection of structural changes in the system. Furthermore, an example is presented to illustrate the validity and practical value of the proposed approach.
Directory of Open Access Journals (Sweden)
Kehinde Anthony Mogaji
2016-07-01
Full Text Available This study developed a GIS-based multivariate regression (MVR) yield rate prediction model of groundwater resource sustainability in the hard-rock geology terrain of southwestern Nigeria. This model can economically manage the aquifer yield rate potential predictions that are often overlooked in groundwater resources development. The proposed model relates the borehole yield rate inventory of the area to geoelectrically derived parameters. Three sets of borehole yield rate conditioning geoelectrically derived parameters—aquifer unit resistivity (ρ), aquifer unit thickness (D) and coefficient of anisotropy (λ)—were determined from the acquired and interpreted geophysical data. The extracted borehole yield rate values and the geoelectrically derived parameter values were regressed to develop the MVR relationship model by applying linear regression and GIS techniques. The sensitivity analysis of the MVR model, evaluated at P ⩽ 0.05 for the predictors ρ, D and λ, provided values of 2.68 × 10−05, 2 × 10−02 and 2.09 × 10−06, respectively. The accuracy and predictive power tests conducted on the MVR model using the Theil inequality coefficient measurement approach, coupled with the sensitivity analysis results, confirmed the model's yield rate estimation and prediction capability. The MVR borehole yield prediction model estimates were processed in a GIS environment to produce an aquifer yield potential prediction map of the area. The information on the prediction map can serve as a scientific basis for predicting aquifer yield potential rates relevant to groundwater resources sustainability management. The developed MVR borehole yield rate prediction model provides a good alternative to other methods used for this purpose.
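The regression step, relating a borehole yield-rate inventory to the three geoelectric predictors ρ, D and λ, reduces to ordinary least squares plus a per-coefficient sensitivity check. A sketch on synthetic data (the coefficients, units and value ranges are invented, not the paper's estimates):

```python
import numpy as np

rng = np.random.default_rng(7)
n = 120
rho = rng.uniform(50, 500, n)    # aquifer unit resistivity (synthetic)
D = rng.uniform(5, 60, n)        # aquifer unit thickness (synthetic)
lam = rng.uniform(1.0, 1.6, n)   # coefficient of anisotropy (synthetic)
yield_rate = 0.5 + 0.002 * rho + 0.08 * D - 1.5 * lam + rng.normal(0, 0.2, n)

# Multivariate linear regression: yield = b0 + b1*rho + b2*D + b3*lam
X = np.column_stack([np.ones(n), rho, D, lam])
beta, *_ = np.linalg.lstsq(X, yield_rate, rcond=None)

# Sensitivity check: t-like statistics from the residual variance
resid = yield_rate - X @ beta
s2 = resid @ resid / (n - X.shape[1])
se = np.sqrt(s2 * np.diag(np.linalg.inv(X.T @ X)))
t_stats = beta / se
```

In the GIS workflow, the fitted equation would then be evaluated on gridded ρ, D and λ layers to produce the yield potential prediction map.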
Bonsu, Bema K; Harper, Marvin B
2004-06-01
Although accurate models for predicting acute bacterial meningitis exist, most have narrow applicability because of the specific variables selected for them. In this study, we estimate the accuracy of a simple new model with potentially broader applicability. On the basis of previous reports, we created a reduced multivariable logistic regression model for predicting bacterial meningitis that relies on age in years (AGE), cerebrospinal fluid (CSF) total protein (TP) and CSF total neutrophil count (TNC) alone. Data were from children aged 1 month to 18 years diagnosed with acute enteroviral or bacterial meningitis whose initial CSF revealed >7 white blood cells/mm³. A fractional polynomial model was specified and validated internally by the bootstrap procedure. The area under the receiver operating characteristic curve (discrimination: criterion standard, >0.7), the Hosmer-Lemeshow deciles-of-risk statistic (calibration: criterion standard, P > 0.05) and sensitivity-specificity pairs at prespecified probability thresholds were computed. We identified 60 children with bacterial meningitis and 82 with enteroviral meningitis. With an area under the receiver operating characteristic curve of 0.97, our model, represented by the equation log odds of bacterial meningitis = 0.343 - 0.003 TNC - 34.802 TP + 21.991 TP - 0.345 AGE, was highly accurate in differentiating between bacterial and enteroviral meningitis. The model fit the data well (Hosmer-Lemeshow statistic, P = 0.53). At probability cutoffs between 0.1 and 0.4, the model had sensitivity values between 98 and 92% and specificity values between 62 and 94%. Among children with CSF pleocytosis, a prediction model based exclusively on age, CSF total protein and CSF neutrophils differentiates accurately between acute bacterial and viral meningitis.
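The logistic model above maps a linear log-odds score to a probability, which is then compared against a cutoff such as 0.1 to favour sensitivity. The sketch below uses placeholder coefficients (the published fractional-polynomial coefficients are partly garbled in this transcription), so `DEMO_COEF` is purely illustrative.

```python
import math

def predict_probability(age_years, csf_tp, csf_tnc, coef):
    """Logistic regression prediction: probability = 1 / (1 + exp(-log_odds)).

    coef = (intercept, b_tnc, b_tp, b_age); values are NOT the published fit.
    """
    b0, b_tnc, b_tp, b_age = coef
    logit = b0 + b_tnc * csf_tnc + b_tp * csf_tp + b_age * age_years
    return 1.0 / (1.0 + math.exp(-logit))

DEMO_COEF = (0.3, -0.003, -3.5, -0.35)  # hypothetical coefficients
risk = predict_probability(age_years=2.0, csf_tp=0.4, csf_tnc=150.0, coef=DEMO_COEF)
flag_for_workup = risk >= 0.1  # a low cutoff trades specificity for sensitivity
```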
Development of a Mathematical Model for Multivariate Process by Balanced Six Sigma
Directory of Open Access Journals (Sweden)
Díaz-Castellanos Elizabeth Eugenia
2015-07-01
The Six Sigma methodology is widely used in business to improve quality, increase productivity and lower costs, with a direct impact on business improvement. Today, however, the challenge is to use those tools for improvements that have a direct impact on value differentiation, which requires the alignment of Six Sigma with the competitive strategies of the organization. Hence the importance of a strategic management system to measure, analyze, improve and control corporate performance, while setting out the responsibilities of leadership and commitment. The specific purpose of this research is to provide a mathematical model, through the alignment of strategic objectives (Balanced Scorecard) and tools for productivity improvement (Six Sigma), for processes with multiple responses, which is sufficiently robust to serve as a basis for application in manufacturing and thus effectively link strategy, performance and customer satisfaction. Specifically, we worked with a case study in Córdoba, Veracruz. The model proposes that if strategy, performance and customer satisfaction are aligned, the organization will benefit from the strong relationship between process performance and strategic initiatives. These changes can be measured by productivity and process metrics such as cycle time, production rates, production efficiency and percentage of reprocessing, among others.
Bayesian spatial modeling of disease risk in relation to multivariate environmental risk fields.
Kim, Ji-in; Lawson, Andrew B; McDermott, Suzanne; Aelion, C Marjorie
2010-01-15
The relationship between exposure to environmental chemicals during pregnancy and early childhood development is an important issue that has a spatial risk component. In this context, we have examined mental retardation and developmental delay (MRDD) outcome measures for children in a Medicaid population in South Carolina and sampled measures of soil chemistry (e.g. As, Hg, etc.) on a network of sites that are misaligned to the outcome residential addresses during pregnancy. The true chemical concentration at the residential addresses is not observed directly and must be interpolated from soil samples. In this study, we have developed a Bayesian joint model that interpolates soil chemical fields and estimates the associated MRDD risk simultaneously. Having multiple spatial fields to interpolate, we have considered a low-rank Kriging method for the interpolation that requires less computation than the Bayesian Kriging. We performed a sensitivity analysis for a bivariate smoothing, changing the number of knots and the smoothing parameter. These analyses show that a low-rank Kriging method can be used as an alternative to a full-rank Kriging, reducing the computational burden. However, the number of knots for the low-rank Kriging model needs to be selected with caution as a bivariate surface estimation can be sensitive to the choice of the number of knots.
Multivariate modelling of storm characteristics on the basis of copulas in Switzerland
Gaal, L.; Molnar, P.; Szolgay, J.
2012-04-01
Copula-based estimation methods for hydro-climatological extremes have gained increasing attention from researchers and practitioners in the last decade. Unlike traditional estimation methods based on bivariate cumulative distribution functions, copulas are a relatively flexible tool that allows dependencies between two or more variables to be modelled without strict assumptions on their marginal distributions. This study focuses on the analysis of the interdependence of critical storm properties (such as rainfall duration, intensity and total rainfall amount) in Switzerland. The database for the analysis consists of rainfall records with 10 minute resolution, available at about 70 SwissMetNet stations and spanning a 26-year observation period. The storm characteristics are estimated from these data on both an annual and a seasonal basis. First, the storm variable combinations are analyzed in terms of their distribution functions. Then, the copula functions that best fit the data are chosen. The cornerstone of the study is an analysis of the seasonal and spatial differences that appear in the patterns of the copula parameters and the dependence models. We attempt to relate the dependence characteristics to the dominant generating mechanisms of precipitation as well as to climatological factors. The aim of the study is to contribute to our understanding of the spatial and seasonal variability of the dependence characteristics of storm properties in an orographically complex environment.
Technology Transfer: A Policy Model
1988-04-01
Caveman Club-Without Nail." More serious scholars indicate that understanding how to start and maintain fires was the first technology transfer of... others. From caveman clubs to hypervelocity missiles, technology transfer has played a significant military role; it also has assisted imperialistic
Sabb, F W; Burggren, A C; Higier, R G; Fox, J; He, J; Parker, D S; Poldrack, R A; Chu, W; Cannon, T D; Freimer, N B; Bilder, R M
2009-11-24
Refining phenotypes for the study of neuropsychiatric disorders is of paramount importance in neuroscience. Poor phenotype definition provides the greatest obstacle for making progress in disorders like schizophrenia, bipolar disorder, Attention Deficit/Hyperactivity Disorder (ADHD), and autism. Using freely available informatics tools developed by the Consortium for Neuropsychiatric Phenomics (CNP), we provide a framework for defining and refining latent constructs used in neuroscience research and then apply this strategy to review known genetic contributions to memory and intelligence in healthy individuals. This approach can help us begin to build multi-level phenotype models that express the interactions between constructs necessary to understand complex neuropsychiatric diseases. These results are available online through the http://www.phenowiki.org database. Further work needs to be done in order to provide consensus-building applications for the broadly defined constructs used in neuroscience research.
Accounting for sex differences in PTSD: A multi-variable mediation model
DEFF Research Database (Denmark)
Christiansen, Dorte M.; Hansen, Maj
2015-01-01
Background: Approximately twice as many females as males are diagnosed with posttraumatic stress disorder (PTSD). However, little is known about why females report more PTSD symptoms than males. Prior studies have generally focused on few potential mediators at a time and have often used...... specifically to test a multiple mediator model. Results: Females reported more PTSD symptoms than males and higher levels of neuroticism, depression, physical anxiety sensitivity, peritraumatic fear, horror, and helplessness (the A2 criterion), tonic immobility, panic, dissociation, negative posttraumatic...... that females report more PTSD symptoms because they experience higher levels of associated risk factors. The results are relevant to other trauma populations and to other trauma-related psychiatric disorders more prevalent in females, such as depression and anxiety. Keywords: Posttraumatic stress disorder...
Yousif, Wael K.
2010-01-01
This causal and correlational study was designed to extend the Technology Acceptance Model (TAM) and to test its applicability to Valencia Community College (VCC) Engineering and Technology students as the target user group when investigating the factors influencing their decision to adopt and to utilize VMware as the target technology. In…
Zu, Theresah N. K.; Liu, Sanchao; Germane, Katherine L.; Servinsky, Matthew D.; Gerlach, Elliot S.; Mackie, David M.; Sund, Christian J.
2016-05-01
The coupling of optical fibers with Raman instrumentation has proven effective for real-time monitoring of chemical reactions and fermentations when combined with multivariate statistical data analysis. Raman spectroscopy is relatively fast, with little interference from the water peak present in fermentation media. Medical research has explored this technique for the analysis of mammalian cultures for potential diagnosis of some cancers. Other organisms studied via this route include Escherichia coli, Saccharomyces cerevisiae, and some Bacillus sp., though very little work has been performed on Clostridium acetobutylicum cultures. C. acetobutylicum is a gram-positive anaerobic bacterium, which is highly sought after due to its ability to use a broad spectrum of substrates and produce useful byproducts through the well-known Acetone-Butanol-Ethanol (ABE) fermentation. In this work, real-time Raman data were acquired from C. acetobutylicum cultures grown on glucose. Samples were collected concurrently for comparative off-line product analysis. Partial least squares (PLS) models were built both for agitated cultures and for static cultures from both datasets. Media components and metabolites monitored include glucose, butyric acid, acetic acid, and butanol. Models were cross-validated with independent datasets. Experiments with agitation were more favorable for modeling, with goodness-of-fit values of 0.99 and goodness-of-prediction (Q2Y) values of 0.98. Static experiments did not model as well as agitated experiments. Raman results showed the static experiments were chaotic, especially during and shortly after manual sampling.
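The PLS calibration step described above can be sketched in its simplest form: a single-component PLS regression (the NIPALS algorithm) on mean-centered spectra. This is a minimal sketch, not the authors' multi-component models; the spectra and concentrations are assumed already centered.

```python
import math

def pls1_fit(X, y):
    """One-component PLS regression (NIPALS) on mean-centered data.

    X: list of spectra (rows), y: list of concentrations.
    Returns coefficients b such that yhat_i = sum_j X[i][j] * b[j].
    """
    n, p = len(X), len(X[0])
    # weight vector w proportional to X'y, normalized
    w = [sum(X[i][j] * y[i] for i in range(n)) for j in range(p)]
    norm = math.sqrt(sum(v * v for v in w))
    w = [v / norm for v in w]
    # scores t = X w and y-loading q = y't / t't
    t = [sum(X[i][j] * w[j] for j in range(p)) for i in range(n)]
    tt = sum(v * v for v in t)
    q = sum(y[i] * t[i] for i in range(n)) / tt
    return [w_j * q for w_j in w]

def pls1_predict(X, b):
    return [sum(x_j * b_j for x_j, b_j in zip(row, b)) for row in X]
```

Real calibrations add further components on deflated residuals and are cross-validated against held-out batches, as the abstract describes.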
Directory of Open Access Journals (Sweden)
Paulino José García Nieto
2015-06-01
The aim of this study was to obtain a predictive model able to perform an early detection of central segregation severity in continuous cast steel slabs. Segregation in steel cast products is an internal defect that can be very harmful when slabs are rolled in heavy plate mills. In this research work, the central segregation was studied with success using the data mining methodology based on the multivariate adaptive regression splines (MARS) technique. For this purpose, the most important physical-chemical parameters are considered. The results of the present study are two-fold. In the first place, the significance of each physical-chemical variable on the segregation is presented through the model. Second, a model for forecasting segregation is obtained. Regression with optimal hyperparameters was performed, and coefficients of determination equal to 0.93 for continuity factor estimation and 0.95 for average width were obtained when the MARS technique was applied to the experimental dataset. The agreement between experimental data and the model confirmed the good performance of the latter.
Hadjipantelis, P Z; Aston, J A D; Müller, H G; Evans, J P
2015-04-03
Mandarin Chinese is characterized by being a tonal language; the pitch (or F0) of its utterances carries considerable linguistic information. However, speech samples from different individuals are subject to changes in amplitude and phase, which must be accounted for in any analysis that attempts to provide a linguistically meaningful description of the language. A joint model for amplitude, phase, and duration is presented, which combines elements from functional data analysis, compositional data analysis, and linear mixed effects models. By decomposing functions via a functional principal component analysis, and connecting registration functions to compositional data analysis, a joint multivariate mixed effect model can be formulated, which gives insights into the relationship between the different modes of variation as well as their dependence on linguistic and nonlinguistic covariates. The model is applied to the COSPRO-1 dataset, a comprehensive database of spoken Taiwanese Mandarin, containing approximately 50,000 phonetically diverse sample F0 contours (syllables), and reveals that phonetic information is jointly carried by both amplitude and phase variation. Supplementary materials for this article are available online.
Sasakura, D; Nakayama, K; Sakamoto, T; Chikuma, T
2015-05-01
The use of transmission near infrared spectroscopy (TNIRS) is of particular interest in the pharmaceutical industry because TNIRS requires no sample preparation and can analyze several tens of tablet samples in an hour. It can measure all relevant information from a tablet while still on the production line. However, TNIRS has a narrow spectral range, and overtone vibrations often overlap. To perform content uniformity testing of tablets by TNIRS, various properties in the tableting process need to be analyzed with a multivariate prediction model, such as partial least squares regression. One issue is that typical approaches require several hundred reference samples as the basis of the method, rather than a strategically designed method. This means that many batches are needed to prepare the reference samples, which takes time and is not cost effective. Our group investigated the concentration dependence of the calibration model with a strategic design. Consequently, we developed a more effective approach to the TNIRS calibration model than the existing methodology.
García Nieto, Paulino José; González Suárez, Victor Manuel; Álvarez Antón, Juan Carlos; Mayo Bayón, Ricardo; Sirgo Blanco, José Ángel; Díaz Fernández, Ana María
2015-01-01
The aim of this study was to obtain a predictive model able to perform an early detection of central segregation severity in continuous cast steel slabs. Segregation in steel cast products is an internal defect that can be very harmful when slabs are rolled in heavy plate mills. In this research work, the central segregation was studied with success using the data mining methodology based on multivariate adaptive regression splines (MARS) technique. For this purpose, the most important physical-chemical parameters are considered. The results of the present study are two-fold. In the first place, the significance of each physical-chemical variable on the segregation is presented through the model. Second, a model for forecasting segregation is obtained. Regression with optimal hyperparameters was performed and coefficients of determination equal to 0.93 for continuity factor estimation and 0.95 for average width were obtained when the MARS technique was applied to the experimental dataset, respectively. The agreement between experimental data and the model confirmed the good performance of the latter.
Research on Supply Chain Modeling Technology
Institute of Scientific and Technical Information of China (English)
无
2000-01-01
Supply chain modeling technology is researched. Firstly, the concepts of supply chain and supply chain management are introduced. Secondly, enterprise-modeling methods, such as CIM-OSA, GIM-GRAI, PERA and ARIS, are analyzed and compared, and supply chain modeling technology is studied. Then the ARIS-based supply chain modeling method is proposed and the supply chain operation reference model is set up. Finally, the application of the ARIS-based supply chain modeling method at Shanghai Turbine Generator Co. Ltd. (STGC) is described in detail.
Hyper-Fit: Fitting Linear Models to Multidimensional Data with Multivariate Gaussian Uncertainties
Robotham, A S G
2015-01-01
Astronomical data is often uncertain with errors that are heteroscedastic (different for each data point) and covariant between different dimensions. Assuming that a set of D-dimensional data points can be described by a (D - 1)-dimensional plane with intrinsic scatter, we derive the general likelihood function to be maximised to recover the best fitting model. Alongside the mathematical description, we also release the hyper-fit package for the R statistical language (github.com/asgr/hyper.fit) and a user-friendly web interface for online fitting (hyperfit.icrar.org). The hyper-fit package offers access to a large number of fitting routines, includes visualisation tools, and is fully documented in an extensive user manual. Most of the hyper-fit functionality is accessible via the web interface. In this paper we include applications to toy examples and to real astronomical data from the literature: the mass-size, Tully-Fisher, Fundamental Plane, and mass-spin-morphology relations. In most cases the hyper-fit ...
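As a sketch of the likelihood the abstract describes, consider the 2D special case: a line y = m x + b with intrinsic scatter, fitted to points whose x and y errors are heteroscedastic and covariant. The residual orthogonal to the line is Gaussian with an effective variance combining the projected measurement covariance and the intrinsic scatter. This is a standard formulation consistent with the paper's setting, not the hyper-fit package itself; the coarse grid search is purely illustrative (the package uses proper optimisers).

```python
import math

def neg_log_like(m, b, sig_int, pts):
    """-ln(likelihood) for a line y = m x + b with intrinsic scatter sig_int.

    Each point is (x, y, var_x, var_y, cov_xy). The orthogonal residual
    d = (y - m x - b) / sqrt(1 + m^2) has variance
    (var_y + m^2 var_x - 2 m cov_xy) / (1 + m^2) + sig_int^2.
    """
    nll = 0.0
    for x, y, vx, vy, vxy in pts:
        d = (y - m * x - b) / math.sqrt(1.0 + m * m)
        s2 = (vy + m * m * vx - 2.0 * m * vxy) / (1.0 + m * m) + sig_int ** 2
        nll += 0.5 * math.log(2.0 * math.pi * s2) + d * d / (2.0 * s2)
    return nll

# Toy data on y = 2x + 1 with small, uncorrelated measurement errors
pts = [(0.0, 1.0, 0.01, 0.01, 0.0),
       (1.0, 3.0, 0.01, 0.01, 0.0),
       (2.0, 5.0, 0.01, 0.01, 0.0)]
grid = [(m, b, s) for m in (1.5, 2.0, 2.5)
                  for b in (0.5, 1.0, 1.5)
                  for s in (0.0, 0.5)]
best = min(grid, key=lambda t: neg_log_like(t[0], t[1], t[2], pts))
```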
Jackson, Dan; White, Ian R; Riley, Richard D
2013-03-01
Multivariate meta-analysis is becoming more commonly used. Methods for fitting the multivariate random effects model include maximum likelihood, restricted maximum likelihood, Bayesian estimation and multivariate generalisations of the standard univariate method of moments. Here, we provide a new multivariate method of moments for estimating the between-study covariance matrix with the properties that (1) it allows for either complete or incomplete outcomes and (2) it allows for covariates through meta-regression. Further, for complete data, it is invariant to linear transformations. Our method reduces to the usual univariate method of moments, proposed by DerSimonian and Laird, in a single dimension. We illustrate our method and compare it with some of the alternatives using a simulation study and a real example. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
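In one dimension the method reduces to the DerSimonian and Laird moment estimator of the between-study variance, which can be sketched directly from its definition (weights are inverse within-study variances; the Q statistic is compared with its expectation under homogeneity):

```python
def dersimonian_laird_tau2(y, v):
    """Univariate DerSimonian-Laird moment estimator of between-study variance.

    y: study effect estimates; v: their within-study variances.
    Returns max(0, (Q - (k - 1)) / c), the truncated moment estimate of tau^2.
    """
    w = [1.0 / vi for vi in v]
    sw = sum(w)
    ybar = sum(wi * yi for wi, yi in zip(w, y)) / sw   # fixed-effect pooled mean
    q = sum(wi * (yi - ybar) ** 2 for wi, yi in zip(w, y))
    c = sw - sum(wi * wi for wi in w) / sw
    return max(0.0, (q - (len(y) - 1)) / c)
```

The multivariate estimator in the paper generalises this to a between-study covariance matrix, handles incomplete outcomes, and admits covariates via meta-regression.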
Feature Technology in Product Modeling
Institute of Scientific and Technical Information of China (English)
ZHANG Xu; NING Ruxin
2006-01-01
A unified feature definition is proposed. Feature is form-concentrated, and can be used to model product functionalities, assembly relations, and part geometries. The feature model is given and a feature classification is introduced including functional, assembly, structural, and manufacturing features. A prototype modeling system is developed in Pro/ENGINEER that can define the assembly and user-defined form features.
[Influence of migration background on child development at school-enrolment - a multivariate model].
Oberwöhrmann, S; Bettge, S; Hermann, S; Meinlschmidt, G
2013-04-01
There has been increasing awareness of the role of a migration background in health over the past years in Germany. Descriptive data show that children from families with a migration background score significantly lower in developmental screening tests at school enrolment than their peers of German origin. The analyses presented here examine the impact of a migration background on child development in the context of additional factors of influence. Data are from the routine examination at school enrolment in Berlin in 2010 and 2011 (N=54 818). Because of the multicollinearity of migration background and the German language skills of the child and its parents, these variables were combined into one variable. Multiple regression models were fitted with 'poor performance in 2 or more developmental domains' as the dependent variable and the migration variables as independent variables, controlled for sociodemographic and other relevant predictor variables. The strongest predictor is the socioeconomic status of the family (OR 5.8). A migration background is a predictor only in combination with insufficient German language skills of child or parent (OR 1.6) or of both child and parent (OR 5.3). Furthermore, very low birth weight (below 1,500 g) strongly predicts poor performance in 2 or more developmental domains (OR 4.2). Having spent not more than 2 years in day care (OR 1.6), living with a single parent, and missing the preventive health check-up at the age of 4 (the so-called U8) have only a weak significant impact (OR 1.2 each). Electronic media exposure (television, computer) is not a significant risk factor in our analyses. The analyses show that migration background is not a risk factor for poor performance in developmental tests per se; rather, the association is attributable to the higher proportion of families with low socioeconomic status and insufficient German language skills in this group. This emphasizes the importance of
Directory of Open Access Journals (Sweden)
Alexander Vladimirovich Kirillov
2015-12-01
The international integration of the Russian economy is connected to the need to realize the competitive advantages of the geopolitical position of Russia, the industrial potential of its regions, and the logistic infrastructure of transport corridors. This article discusses a design model for the supply chain (distribution network) based on multivariate analysis, and a methodology for substantiating its configuration based on cost factors and the level of logistics infrastructure development. For the problem of placing one or more logistics centers in the service area, a two-stage algorithm is used. At the first stage, decisions on the reasonability of one or another development option are made with the use of the standard "make or buy" model. The decision criterion is the guaranteed overcoming of the threshold of "indifference", taking into account the statistical characteristics of costs for the "buy" and "make" options depending on the volume of consumption of goods or services. At the second stage, Ardalan's heuristic method is used to evaluate the choice of placing one or more logistics centers in the service area. The model parameters are based on an assessment of the development prospects of the region and its investment potential (existence and composition of employment, production, natural resources, financial and consumer opportunities, and institutional, innovation and infrastructure capacity). Furthermore, such criteria as regional financial appeal, professionally trained specialists, and the competitive advantages of the promoted company are analyzed. An additional criterion is the development of a priority matrix, which considers such factors as difficulties of customs registration and certification, and the level of regional transport
Keppenne, Christian L.; Rienecker, Michele M.; Koblinsky, Chester (Technical Monitor)
2001-01-01
A multivariate ensemble Kalman filter (MvEnKF) has been implemented on a massively parallel computer architecture for the Poseidon ocean circulation model and tested with a Pacific Basin model configuration. There are about two million prognostic state-vector variables. Parallelism for the data assimilation step is achieved by regionalization of the background-error covariances that are calculated from the phase-space distribution of the ensemble. Each processing element (PE) collects elements of a matrix measurement functional from nearby PEs. To avoid the introduction of spurious long-range covariances associated with finite ensemble sizes, the background-error covariances are given compact support by means of a Hadamard (element-by-element) product with a three-dimensional canonical correlation function. The methodology and the MvEnKF configuration are discussed. It is shown that the regionalization of the background covariances has a negligible impact on the quality of the analyses. The parallel algorithm is very efficient for large numbers of observations but does not scale well beyond 100 PEs at the current model resolution. On a platform with distributed memory, memory rather than speed is the limiting factor.
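The covariance localization step can be sketched in one dimension: the sample covariance of a small ensemble is multiplied element-by-element with a compactly supported correlation function, zeroing spurious long-range terms. A simple triangular taper stands in here for the three-dimensional correlation function used in the paper, and distance is taken as grid-index separation for illustration.

```python
def sample_covariance(ensemble):
    """Sample covariance of an ensemble of state vectors (rows = members)."""
    n, m = len(ensemble), len(ensemble[0])
    mean = [sum(mem[j] for mem in ensemble) / n for j in range(m)]
    return [[sum((mem[a] - mean[a]) * (mem[b] - mean[b]) for mem in ensemble) / (n - 1)
             for b in range(m)] for a in range(m)]

def localize(cov, length):
    """Hadamard product with a triangular taper rho(d) = max(0, 1 - d / length).

    Grid-index distance |a - b| stands in for physical distance; entries
    beyond `length` are zeroed, giving the covariance compact support.
    """
    m = len(cov)
    return [[cov[a][b] * max(0.0, 1.0 - abs(a - b) / length) for b in range(m)]
            for a in range(m)]

# Three-member ensemble on a 5-point grid (invented values)
members = [[1.0, 2.0, 0.0, 1.0, 5.0],
           [2.0, 0.0, 1.0, 3.0, 2.0],
           [0.0, 1.0, 2.0, 2.0, 2.0]]
C = sample_covariance(members)
C_loc = localize(C, 2.0)
```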
Botsis, Taxiarchis; Anagnostou, Valsamo K; Hartvigsen, Gunnar; Hripcsak, George; Weng, Chunhua
2010-01-01
OBJECTIVE: Current staging systems are not accurate for classifying pancreatic endocrine tumors (PETs) by risk. Here, we developed a prognostic model for PETs and compared it to the WHO classification system. METHODS: We identified 98 patients diagnosed with PET at NewYork-Presbyterian Hospital/Columbia University Medical Center (1999 to 2009). Tumor and clinical characteristics were retrieved and associations with survival were assessed by univariate Cox analysis. A multivariable model was constructed and a risk score was calculated; the prognostic strength of our model was assessed with the concordance index. RESULTS: Our cohort had a median age of 60 years and consisted of 61.2% women; median follow-up time was 10.4 months (range: 0.1-99.6) with a 5-year survival of 61.5%. The majority of PETs were non-functional, and no difference was observed between functional and non-functional tumors with respect to WHO stage, age, pathologic characteristics or survival. Distant metastases (HR=3.39, 95% CI: 1.38-8.35, p=0.008), aspartate aminotransferase (AST) (HR=3.73, 95% CI: 1.20-11.57, p=0.023) and surgical resection (HR=0.20, 95% CI: 0.08-0.51, p < 0.05) were independently associated with survival; the resulting risk score may support clinical decisions.
Alamdari, R F; Mani-Varnosfaderani, A; Asadollahi-Baboli, M; Khalafi-Nezhad, A
2012-10-01
The present work focuses on the development of an interpretable quantitative structure-activity relationship (QSAR) model for predicting the anti-HIV activities of 67 thiazolylthiourea derivatives. This set of molecules has been proposed as potent HIV-1 reverse transcriptase inhibitors (RT-INs). The molecules were encoded with a diverse set of molecular descriptors spanning different physical and chemical properties. Monte Carlo (MC) sampling and multivariate adaptive regression spline (MARS) techniques were used to select the most important descriptors and to predict the activity of the molecules. The most important descriptor was found to be the asphericity index. The analysis of variance (ANOVA) and the interpretable spline equations showed that the geometrical shape of the molecules has a considerable effect on their activities. It seems that linear molecules are more active than symmetric-top compounds. The final MARS model displayed a good predictive ability, judging from the determination coefficients of the leave-multiple-out (LMO) cross-validation technique, i.e. r² = 0.828 (M = 12) and r² = 0.813 (M = 20). The results of this work show that the developed spline model is robust, has good predictive power, and can be used as a reliable tool for designing novel HIV-1 RT-INs.
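MARS models of the kind described above are sums of hinge basis functions max(0, x - k) and max(0, k - x), which makes the fitted equation directly interpretable. The sketch below evaluates a hypothetical two-descriptor spline: the coefficients, knots, and the second descriptor are invented for illustration and are not the published model.

```python
def hinge(x, knot):
    """MARS hinge basis function: max(0, x - knot)."""
    return max(0.0, x - knot)

def mars_activity(asphericity, logp):
    """Hypothetical MARS activity surface (all coefficients/knots invented)."""
    return (5.0
            - 2.1 * hinge(asphericity, 0.30)        # activity drops past the knot
            + 1.4 * hinge(0.30 - asphericity, 0.0)  # near-linear shapes gain activity
            + 0.6 * hinge(logp, 2.0))               # a second, invented descriptor
```

Because each hinge is zero on one side of its knot, the contribution of each descriptor region can be read off term by term, which is the interpretability the abstract emphasizes.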
Technology Acceptance Model for Wireless Internet.
Lu, June; Yu, Chun-Sheng; Liu, Chang; Yao, James E.
2003-01-01
Develops a technology acceptance model (TAM) for wireless Internet via mobile devices (WIMD) and proposes that constructs, such as individual differences, technology complexity, facilitating conditions, social influences, and wireless trust environment determine user-perceived short and long-term usefulness, and ease of using WIMD. Twelve…
Strategies for Industrial Multivariable Control
DEFF Research Database (Denmark)
Hangstrup, M.
Multivariable control strategies well-suited for industrial applications are suggested. The strategies combine the practical advantages of conventional SISO control schemes and -technology with the potential of multivariable controllers. Special emphasis is put on parameter-varying systems whose...... dynamics and gains strongly depend upon one or more physical parameters characterizing the operating point. This class covers many industrial systems such as airplanes, ships, robots and process control systems. Power plant boilers are representatives for process control systems in general. The dynamics...
Business Model Discovery by Technology Entrepreneurs
Directory of Open Access Journals (Sweden)
Steven Muegge
2012-04-01
Value creation and value capture are central to technology entrepreneurship. The ways in which a particular firm creates and captures value are the foundation of that firm's business model, which is an explanation of how the business delivers value to a set of customers at attractive profits. Despite the deep conceptual link between business models and technology entrepreneurship, little is known about the processes by which technology entrepreneurs produce successful business models. This article makes three contributions to partially address this knowledge gap. First, it argues that business model discovery by technology entrepreneurs can be, and often should be, disciplined by both intention and structure. Second, it provides a tool for disciplined business model discovery that includes an actionable process and a worksheet for describing a business model in a form that is both concise and explicit. Third, it shares preliminary results and lessons learned from six technology entrepreneurs applying a disciplined process to strengthen or reinvent the business models of their own nascent technology businesses.
Multivariate Statistical Process Control
DEFF Research Database (Denmark)
Kulahci, Murat
2013-01-01
As sensor and computer technology continues to improve, it becomes a normal occurrence that we confront with high dimensional data sets. As in many areas of industrial statistics, this brings forth various challenges in statistical process control (SPC) and monitoring for which the aim...... is to identify “out-of-control” state of a process using control charts in order to reduce the excessive variation caused by so-called assignable causes. In practice, the most common method of monitoring multivariate data is through a statistic akin to the Hotelling’s T2. For high dimensional data with excessive...
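The Hotelling's T² statistic mentioned above can be sketched in the bivariate case: estimate the in-control mean and covariance, then score each new observation by its squared Mahalanobis distance from the mean. The tiny dataset is invented for illustration; high-dimensional practice replaces the explicit 2×2 inverse with more careful estimation, as the abstract discusses.

```python
def estimate(data):
    """Sample mean and 2x2 sample covariance of bivariate in-control data."""
    n = len(data)
    mean = (sum(p[0] for p in data) / n, sum(p[1] for p in data) / n)
    sxx = sum((p[0] - mean[0]) ** 2 for p in data) / (n - 1)
    syy = sum((p[1] - mean[1]) ** 2 for p in data) / (n - 1)
    sxy = sum((p[0] - mean[0]) * (p[1] - mean[1]) for p in data) / (n - 1)
    return mean, ((sxx, sxy), (sxy, syy))

def hotelling_t2(x, mean, cov):
    """T^2 = (x - m)' S^{-1} (x - m) for a bivariate observation."""
    dx, dy = x[0] - mean[0], x[1] - mean[1]
    (a, b), (c, d) = cov
    det = a * d - b * c
    inv = ((d / det, -b / det), (-c / det, a / det))  # 2x2 inverse
    return (dx * (inv[0][0] * dx + inv[0][1] * dy)
            + dy * (inv[1][0] * dx + inv[1][1] * dy))

# Invented in-control observations; points with large T^2 signal assignable causes
data = [(0.0, 0.0), (2.0, 0.0), (0.0, 2.0), (2.0, 2.0)]
mean, cov = estimate(data)
t2_new = hotelling_t2((3.0, 1.0), mean, cov)
```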
Causal Models for Safety Assurance Technologies Project
National Aeronautics and Space Administration — Fulfillment of NASA's System-Wide Safety and Assurance Technology (SSAT) project at NASA requires leveraging vast amounts of data into actionable knowledge. Models...
A merge model with endogenous technological change
Energy Technology Data Exchange (ETDEWEB)
Kypreos, S.; Bahn, O.
2002-03-01
A new version of the MERGE model, called MERGE-ETL, has been developed to consider endogenous technological change in the energy system. The basic formulation of MERGE-ETL as well as some first results are reported here. (author)
Märk, Julia; Andre, Max; Karner, Martin; Huck, Christian W
2010-10-01
NIR spectroscopy was applied to develop a fast and reliable quality control system for a pharmaceutical substance, supporting information obtained through PAT surveillance of its manufacturing process. After calculating different quantitative calibrations of the substance's key quality parameters, a general classification model was derived to capture the overall product grade. The final spectral quality conformity model, consisting of 96 representative batches covering high process variability, was sensitized toward five important quality parameters by their incorporation as PLS responses. The model characteristics were extensively investigated and interpreted to derive a reasonable limit for the reduced chemometric summary quality measure (Hotelling's T²). Through this parameter, new batches can be assessed easily from their NIR spectra, using versatile test batches for confirmation. Different sets of good-quality batches, bad production batches beyond the respective chemical quality limit, and synthetic batches exactly at the limit could be accurately assigned through their multivariate evaluation to a large extent. However, high model sensitivity to non-relevant product properties can limit the applicability of the model. This may be caused by the restricted bandwidth of quality parameters available for calibration in the production environment, repack effects and high process instability.
Global Health Innovation Technology Models
Directory of Open Access Journals (Sweden)
Kimberly Harding
2016-04-01
Full Text Available Chronic technology and business process disparities between High Income, Low Middle Income and Low Income (HIC, LMIC, LIC) research collaborators directly prevent the growth of sustainable Global Health innovation for infectious and rare diseases. There is a need for an Open Source-Open Science Architecture Framework to bridge this divide. We are proposing such a framework for consideration by the Global Health community, using a hybrid approach that integrates agnostic Open Source technology, healthcare interoperability standards, and Total Quality Management principles. We will validate this architecture framework through our programme called Project Orchid. Project Orchid is a conceptual Clinical Intelligence Exchange and Virtual Innovation platform utilizing this approach to support clinical innovation efforts for multi-national collaboration that can be locally sustainable for LIC and LMIC research cohorts. The goal is to enable LIC and LMIC research organizations to accelerate their clinical trial process maturity in the field of drug discovery, population health innovation initiatives and public domain knowledge networks. When sponsored, this concept will be tested by 12 confirmed clinical research and public health organizations in six countries. The potential impact of this platform is reduced drug discovery and public health innovation lag time and improved clinical trial interventions, due to reliable clinical intelligence and bio-surveillance across all phases of the clinical innovation process.
BUSINESS MODEL PATTERNS FOR DISRUPTIVE TECHNOLOGIES
BENJAMIN AMSHOFF; CHRISTIAN DÜLME; JULIAN ECHTERFELD; JÜRGEN GAUSEMEIER
2015-01-01
Companies nowadays face a myriad of business opportunities as a direct consequence of manifold disruptive technology developments. As a basic characteristic, disruptive technologies lead to a severe shift in value-creation networks giving rise to new market segments. One of the key challenges is to anticipate the business logics within these nascent and formerly unknown markets. Business model patterns promise to tackle this challenge. They can be interpreted as proven business model elements...
Finto Antony; Laurence R. Schimleck; Alex Clark; Richard F. Daniels
2012-01-01
Specific gravity (SG) and moisture content (MC) both have a strong influence on the quantity and quality of wood fiber. We proposed a multivariate mixed model system to model the two properties simultaneously. Disk SG and MC at different height levels were measured from 3 trees in 135 stands across the natural range of loblolly pine and the stand level values were used...
Song, Seung Yeob; Lee, Young Koung; Kim, In-Jung
2016-01-01
A high-throughput screening system was established to identify Citrus lines with higher sugar and acid contents, using Fourier transform infrared (FT-IR) spectroscopy in combination with multivariate analysis. FT-IR spectra confirmed typical spectral differences between the frequency regions of 950-1100 cm(-1), 1300-1500 cm(-1), and 1500-1700 cm(-1). Principal component analysis (PCA) and subsequent partial least square-discriminant analysis (PLS-DA) were able to discriminate five Citrus lines into three separate clusters corresponding to their taxonomic relationships. Quantitative predictive modeling of sugar and acid contents from Citrus fruits was established using partial least square regression algorithms on the FT-IR spectra. The regression coefficients (R(2)) between predicted and estimated sugar and acid content values were 0.99. These results demonstrate that, by using FT-IR spectra and applying quantitative prediction modeling to Citrus sugar and acid contents, excellent Citrus lines can be detected early with greater accuracy.
Multivariate strategies in functional magnetic resonance imaging
DEFF Research Database (Denmark)
Hansen, Lars Kai
2007-01-01
We discuss aspects of multivariate fMRI modeling, including the statistical evaluation of multivariate models and means for dimensional reduction. In a case study we analyze linear and non-linear dimensional reduction tools in the context of a `mind reading' predictive multivariate fMRI model....
Transient multivariable sensor evaluation
Energy Technology Data Exchange (ETDEWEB)
Vilim, Richard B.; Heifetz, Alexander
2017-02-21
A method and system for performing transient multivariable sensor evaluation. The method and system include a computer system for identifying a model form, providing training measurement data, generating a basis vector, monitoring system data from a sensor, loading the system data into non-transient memory, performing an estimation to provide desired data, comparing the system data to the desired data, and outputting an alarm for a defective sensor.
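The estimate-and-compare loop in this patent abstract can be caricatured as a residual check against a fitted sensor model. The model form, coefficients, and alarm threshold below are all hypothetical, chosen only to show the shape of the procedure:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical training data: one sensor channel predicted from two others
# (a stand-in for the "model form" and "basis vector" identification step).
X_train = rng.normal(size=(500, 2))
true_coef = np.array([1.5, -0.7])
y_train = X_train @ true_coef + 0.05 * rng.normal(size=500)

# Least-squares estimate of the sensor model and its residual scale.
coef, *_ = np.linalg.lstsq(X_train, y_train, rcond=None)
resid_sd = np.std(y_train - X_train @ coef)

def sensor_alarm(x, y, n_sigma=5.0):
    """Alarm if the measurement deviates from the estimate by > n_sigma."""
    return abs(y - x @ coef) > n_sigma * resid_sd

x_now = np.array([1.0, 1.0])
ok = sensor_alarm(x_now, x_now @ true_coef)     # consistent reading
bad = sensor_alarm(x_now, 10.0)                 # defective (stuck-high) reading
print("alarm on healthy reading:", bool(ok), "| on faulty reading:", bool(bad))
```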
Jensen, Dan B; Hogeveen, Henk; De Vries, Albert
2016-09-01
Rapid detection of dairy cow mastitis is important so corrective action can be taken as soon as possible. Automatically collected sensor data used to monitor the performance and the health state of the cow could be useful for rapid detection of mastitis while reducing the labor needs for monitoring. The state of the art in combining sensor data to predict clinical mastitis still does not perform well enough to be applied in practice. Our objective was to combine a multivariate dynamic linear model (DLM) with a naïve Bayesian classifier (NBC) in a novel method using sensor and nonsensor data to detect clinical cases of mastitis. We also evaluated reductions in the number of sensors for detecting mastitis. With the DLM, we co-modeled 7 sources of sensor data (milk yield, fat, protein, lactose, conductivity, blood, body weight) collected at each milking for individual cows to produce one-step-ahead forecasts for each sensor. The observations were subsequently categorized according to the errors of the forecasted values and the estimated forecast variance. The categorized sensor data were combined with other data pertaining to the cow (week in milk, parity, mastitis history, somatic cell count category, and season) using Bayes' theorem, which produced a combined probability of the cow having clinical mastitis. If this probability was above a set threshold, the cow was classified as mastitis positive. To illustrate the performance of our method, we used sensor data from 1,003,207 milkings from the University of Florida Dairy Unit collected from 2008 to 2014. Of these, 2,907 milkings were associated with recorded cases of clinical mastitis. Using the DLM/NBC method, we reached an area under the receiver operating characteristic curve of 0.89, with a specificity of 0.81 when the sensitivity was set at 0.80. Specificities with omissions of sensor data ranged from 0.58 to 0.81. These results are comparable to other studies, but differences in data quality, definitions of
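The core idea, forecast each sensor, categorize the forecast error, then combine evidence with Bayes' theorem, can be sketched with a univariate exponentially weighted forecast standing in for the multivariate DLM. The series, likelihoods, and prior below are illustrative, not the paper's fitted values:

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical per-milking conductivity series; the step at milking 90
# mimics the onset of a clinical case.
y = 5.0 + 0.1 * rng.normal(size=100)
y[90:] += 1.5

# One-step-ahead forecasts from an exponentially weighted mean -- a very
# reduced stand-in for the paper's multivariate dynamic linear model.
alpha, level = 0.2, y[0]
errors = []
for obs in y:
    errors.append(obs - level)         # forecast error before updating
    level += alpha * (obs - level)
errors = np.array(errors)

# Categorize the forecast error and combine with a prior via Bayes' theorem.
# All probabilities below are illustrative, not values from the study.
p_large_sick, p_large_healthy, prior = 0.8, 0.05, 0.01
large = np.abs(errors) > 3 * errors[:80].std()

def posterior(is_large):
    ls = p_large_sick if is_large else 1 - p_large_sick
    lh = p_large_healthy if is_large else 1 - p_large_healthy
    return ls * prior / (ls * prior + lh * (1 - prior))

print("posterior P(mastitis) at milking 90:", round(posterior(large[90]), 3))
```

The paper's naïve Bayesian classifier multiplies such likelihood terms across several categorized sensors and cow-level factors before thresholding.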
Venkatapathi, Murugesan; Rajwa, Bartek; Ragheb, Kathy; Banada, Padmapriya P.; Lary, Todd; Robinson, J. Paul; Hirleman, E. Daniel
2008-02-01
We describe a model-based instrument design combined with a statistical classification approach for the development and realization of high-speed cell classification systems based on light scatter. In our work, angular light scatter from cells of four bacterial species of interest, Bacillus subtilis, Escherichia coli, Listeria innocua, and Enterococcus faecalis, was modeled using the discrete dipole approximation. We then optimized a scattering detector array design subject to some hardware constraints, configured the instrument, and gathered experimental data from the relevant bacterial cells. Using these models and experiments, it is shown that optimization using a nominal bacteria model (i.e., using a representative size and refractive index) is insufficient for classification of most bacteria in realistic applications. Hence the computational predictions were constituted in the form of scattering-data-vector distributions that accounted for expected variability in the physical properties between individual bacteria within the four species. After the detectors were optimized using the numerical results, they were used to measure scatter from both the known control samples and unknown bacterial cells. A multivariate statistical method based on a support vector machine (SVM) was used to classify the bacteria species based on light scatter signatures. In our final instrument, we realized correct classification of B. subtilis in the presence of E. coli, L. innocua, and E. faecalis using SVM at 99.1%, 99.6%, and 98.5%, respectively, in the optimal detector array configuration. For comparison, the corresponding values for another set of angles were only 69.9%, 71.7%, and 70.2% using SVM, and more importantly, this improved performance is consistent with classification predictions.
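The final classification step is a support vector machine applied to scatter signatures. A toy version with synthetic detector readings for two classes (the study used DDA-modeled and measured scatter from four species, and reports cross-class performance rather than resubstitution accuracy):

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(4)

# Hypothetical scatter-signature vectors: 6 detector readings per cell for
# two "species"; real inputs would be the optimized detector-array data.
n, d = 200, 6
X = np.vstack([rng.normal(0.0, 1.0, (n, d)), rng.normal(1.5, 1.0, (n, d))])
y = np.array([0] * n + [1] * n)

clf = SVC(kernel="rbf").fit(X, y)       # RBF-kernel support vector machine
accuracy = clf.score(X, y)              # resubstitution accuracy only
print("training accuracy:", round(accuracy, 3))
```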
Wang, X; Li, L; Yang, Z; Zheng, X; Yu, S; Xu, C; Hu, Z
2017-03-01
Genomic selection (GS) is more efficient than traditional phenotype-based methods in hybrid breeding. The present study investigated the predictive ability of genomic best linear unbiased prediction models for rice hybrids based on the North Carolina mating design II, in which a total of 115 inbred rice lines were crossed with 5 male sterile lines. Using 8 traits of the 575 (115 × 5) hybrids from two environments, both univariate (UV) and multivariate (MV) prediction analyses, including additive and dominance effects, were performed. Using UV models, the cross-validation results indicated that including dominance effects could improve the predictive ability for some traits in rice hybrids. Additionally, we could take advantage of GS even for a low-heritability trait, such as grain yield per plant (GY), because a modest increase in the number of top selections could generate a higher, more stable mean phenotypic value for rice hybrids. Thus this strategy was used to select superior potential crosses between the 115 inbred lines and those between the 5 male sterile lines and other genotyped varieties. In our MV research, an MV model (MV-ADV) was developed utilizing an MV relationship matrix constructed with auxiliary variates. Based on joint analysis with multi-trait (MT) or with multi-environment data, the prediction results confirmed the superiority of MV-ADV over a UV model, particularly in the MT scenario for a low-heritability target trait (such as GY) with highly correlated auxiliary traits. For a high-heritability trait (such as thousand-grain weight), MT prediction is unnecessary, and UV prediction is sufficient.
Andrade, A I A S S; Stigter, T Y
2013-04-01
In this study multivariate and geostatistical methods are jointly applied to model the spatial and temporal distribution of arsenic (As) concentrations in shallow groundwater as a function of physicochemical, hydrogeological and land use parameters, as well as to assess the related uncertainty. The study site is located in the Mondego River alluvial body in Central Portugal, where maize, rice and some vegetable crops dominate. In a first analysis scatter plots are used, followed by the application of principal component analysis to two different data matrices, of 112 and 200 samples, with the aim of detecting associations between As levels and other quantitative parameters. In the following phase explanatory models of As are created through factorial regression based on correspondence analysis, integrating both quantitative and qualitative parameters. Finally, these are combined with indicator-geostatistical techniques to create maps indicating the predicted probability of As concentrations in groundwater exceeding the current global drinking water guideline of 10 μg/l. These maps further allow assessing the uncertainty and representativeness of the monitoring network. A clear effect of the redox state on the presence of As is observed, and together with significant correlations with dissolved oxygen, nitrate, sulfate, iron, manganese and alkalinity, points towards the reductive dissolution of Fe (hydr)oxides as the essential mechanism of As release. The association of high As values with rice crop, known to promote reduced environments due to ponding, further corroborates this hypothesis. An additional source of As from fertilizers cannot be excluded, as the correlation with As is higher where rice is associated with vegetables, normally associated with higher fertilization rates. The best explanatory model of As occurrence integrates the parameters season, crop type, well and water depth, nitrate and Eh, though a model without the last two parameters also gives
Raji, M A; Frycák, P; Temiyasathit, C; Kim, S B; Mavromaras, G; Ahn, J-M; Schug, K A
2009-07-01
Response factors were determined for twelve GXG peptides (where G stands for glycine and X is any of alanine [A], arginine [R], asparagine [N], aspartic acid [D], glycine [G], histidine [H], leucine [L], lysine [K], phenylalanine [F], serine [S], tyrosine [Y], valine [V]) by electrospray ionization mass spectrometry (ESI-MS). The response factors were measured using a novel flow injection method. This new method is based on the Gaussian distribution of analyte concentration resulting from band-broadening dispersion experienced by the analyte upon passage through an extended volume of PEEK tubing. This method removes the need for preparing a discrete series of standard solutions to assess concentration-dependent response. Relative response factors were calculated for each peptide with reference to GGG. The observed trends in the relative response factors were correlated with several analyte physicochemical parameters, chosen based on current understanding of ion release from charged droplets during the ESI process. These include analyte properties: nonpolar surface area; polar surface area; gas-phase basicity; proton affinity; and Log D. Multivariate statistical analyses using multiple linear regression, decision tree, and support vector regression models were investigated to assess their potential for predicting ESI response based on the analyte properties. The support vector regression model was more versatile and produced the least predictive error following 12-fold cross-validation. The effect of variation in solution pH on the relative response factors is highlighted, as evidenced by the different predictive models obtained for peptide response at two pH values (pH = 6.0 and 9.0). The relationship between physicochemical parameters and associated ionization efficiencies for GXG tripeptides is discussed based on the equilibrium partitioning model. Copyright 2009 John Wiley & Sons, Ltd.
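The predictive-modeling comparison above can be reproduced in miniature: support vector regression evaluated by 12-fold cross-validation, here on synthetic descriptors rather than the measured GXG response factors (all data and hyperparameters below are illustrative):

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVR

rng = np.random.default_rng(5)

# Hypothetical physicochemical descriptors (surface areas, basicity, log D,
# ...) and a synthetic "relative response factor" target; not the paper's data.
X = rng.normal(size=(120, 5))
true_coef = np.array([0.5, -0.3, 0.2, 0.1, 0.4])
y = X @ true_coef + 0.1 * rng.normal(size=120)

svr = SVR(kernel="rbf", C=10.0)
scores = cross_val_score(svr, X, y, cv=12, scoring="neg_mean_squared_error")
rmse = np.sqrt(-scores.mean())
print("12-fold CV RMSE:", round(rmse, 3))
```

Swapping `SVR` for `LinearRegression` or `DecisionTreeRegressor` reproduces the kind of model comparison the paper reports.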
Up-scaling of multi-variable flood loss models from objects to land use units at the meso-scale
Directory of Open Access Journals (Sweden)
H. Kreibich
2016-05-01
Full Text Available Flood risk management increasingly relies on risk analyses, including loss modelling. Most of the flood loss models usually applied in standard practice have in common that complex damaging processes are described by simple approaches like stage-damage functions. Novel multi-variable models significantly improve loss estimation on the micro-scale and may also be advantageous for large-scale applications. However, more input parameters also introduce additional uncertainty, even more so in up-scaling procedures for meso-scale applications, where the parameters need to be estimated on a regional, area-wide basis. To gain more knowledge about the challenges associated with the up-scaling of multi-variable flood loss models, the following approach is applied: single- and multi-variable micro-scale flood loss models are up-scaled and applied on the meso-scale, namely on the basis of ATKIS land-use units. Application and validation are undertaken in 19 municipalities, which were affected during the 2002 flood by the River Mulde in Saxony, Germany, by comparison to official loss data provided by the Saxon Relief Bank (SAB). In the meso-scale, case-study-based model validation, most multi-variable models show smaller errors than the uni-variable stage-damage functions. The results show the suitability of the up-scaling approach and, in accordance with micro-scale validation studies, that multi-variable models are an improvement in flood loss modelling also on the meso-scale. However, uncertainties remain high, stressing the importance of uncertainty quantification. Thus, the development of probabilistic loss models, like BT-FLEMO used in this study, which inherently provide uncertainty information, is the way forward.
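For contrast with the multi-variable models discussed above, a uni-variable stage-damage function is just an interpolated depth-to-loss curve. The depth/loss pairs below are illustrative, not calibrated values:

```python
import numpy as np

# A classical uni-variable stage-damage function: loss ratio as a
# piecewise-linear function of water depth alone.
depth_m    = np.array([0.0, 0.5, 1.0, 2.0, 3.0])
loss_ratio = np.array([0.0, 0.15, 0.30, 0.55, 0.70])

def stage_damage(depth):
    """Interpolate the loss ratio for a given water depth in metres."""
    return float(np.interp(depth, depth_m, loss_ratio))

# A multi-variable model would add inputs such as building type,
# precaution, or contamination; here only depth enters.
print("loss ratio at 1.5 m depth:", stage_damage(1.5))
```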
Ma, Junxiu; Qi, Juan; Gao, Xinyu; Yan, Chunhua; Zhang, Tianlong; Tang, Hongsheng
2017-01-01
3,5-Diamino-1,2,4-triazole (DAT) has become an important intermediate for energetic materials, and the study of its reaction mechanism is of fundamental significance in chemistry. The aim of this study is to investigate the ability of online attenuated total reflection infrared (ATR-IR) spectroscopy, combined with the novel approach of hybrid hard- and soft-modelling multivariate curve resolution-alternating least squares (HS-MCR) analysis, to monitor and detect changes in the structural properties of the compound during 3,5-diamino-1,2,4-triazole (DAT) synthesis processes. The subspace comparison method (SCM) was used to obtain the number of principal components, and the pure IR spectra of each substance were then obtained by independent component analysis (ICA) and HS-MCR. The extent of rotation ambiguity was estimated from the band boundaries of feasible solutions calculated using the MCR-BANDS procedure. The results revealed five principal components in the process, including two intermediates. The reaction rate constants of the DAT formation reaction were also obtained by HS-MCR. Using HS-MCR to analyze spectroscopic data from a chemical synthesis process not only increases the information domain but also reduces the ambiguities of the obtained results. This study provides a theoretical basis for optimizing the synthesis process and technology of energetic materials and strong technical support for the research and development of energetic materials with extraordinary damage effects. PMID:28386512
Advances in technology: commercialization models
Marinakis, Yorgos D.
2016-01-01
Through the lens of the philosophy of science, we reconsider things that are currently being taken for granted and locate issues that are not currently being treated. In general, that lens has been more focused on views of scientific theories rather than theories of models. A philosophy of science...
Directory of Open Access Journals (Sweden)
Sepedeh Gholizadeh
2016-07-01
Full Text Available Background: Obesity and hypertension are among the most important non-communicable diseases; in many studies, their prevalence and risk factors have been examined univariately in each geographic region. The study of factors affecting both obesity and hypertension may play an important role, which is addressed here. Materials & Methods: This cross-sectional study was conducted on 1000 men aged 20-70 living in Bushehr province. Blood pressure was measured three times and the average was considered as one of the response variables. Hypertension was defined as systolic blood pressure ≥140 and/or diastolic blood pressure ≥90, and obesity was defined as body mass index ≥25. Data were analyzed using a multilevel, multivariate logistic regression model with the MLwiN software. Results: Intraclass correlations at the cluster level were 33% for high blood pressure and 37% for obesity, so a two-level model was fitted to the data. The prevalence of obesity and hypertension was 43.6% (95% CI: 40.6-46.5) and 29.4% (95% CI: 26.6-32.1), respectively. Age, gender, smoking, hyperlipidemia, diabetes, fruit and vegetable consumption, and physical activity were the factors affecting blood pressure (p≤0.05). Age, gender, hyperlipidemia, diabetes, fruit and vegetable consumption, physical activity, and place of residence affect obesity (p≤0.05). Conclusion: Multilevel models that account for the distribution across levels provide more precise estimates. As obesity and hypertension are major risk factors for cardiovascular disease, knowing the high-risk groups allows careful planning for the prevention of non-communicable diseases and the promotion of public health.
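The reported intraclass correlations (33% and 37%) are naturally read on the latent-variable scale of a two-level logistic model, where the level-1 variance is fixed at π²/3. A small sketch; the cluster-level variance fed in is illustrative, chosen to land near an ICC of 1/3, and is not the study's fitted value:

```python
import math

# Latent-variable intraclass correlation for a two-level logistic model:
#   ICC = var_between / (var_between + pi^2 / 3)
# where pi^2/3 is the level-1 variance of the standard logistic distribution.
def logistic_icc(cluster_variance):
    return cluster_variance / (cluster_variance + math.pi ** 2 / 3)

# Illustrative cluster-level variance giving an ICC near 1/3.
print(round(logistic_icc(1.64), 3))    # prints: 0.333
```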
Institute of Scientific and Technical Information of China (English)
Li-gang MA; Jin-song DENG; Huai YANG; Yang HONG; Ke WANG
2015-01-01
The Chinese ZY-1 02C satellite is one of the most advanced high-resolution earth observation systems designed for terrestrial resource monitoring. Its capability for comprehensive landscape classification, especially in urban areas, has been under constant study. In view of the limited spectral resolution of the ZY-1 02C satellite (three bands), and the complexity and heterogeneity across urban environments, we attempt to test its performance of urban landscape classification by combining a multi-variable model with an object-oriented approach. The multiple variables including spectral reflection, texture, spatial autocorrelation, impervious surface fraction, vegetation, and geometry indexes were first calculated and selected using forward stepwise linear discriminant analysis and applied in the following object-oriented classification process. Comprehensive accuracy assessment which adopts traditional error matrices with stratified random samples and polygon area consistency (PAC) indexes was then conducted to examine the real area agreement between a classified polygon and its references. Results indicated an overall classification accuracy of 92.63% and a kappa statistic of 0.9124. Furthermore, the proposed PAC index showed that more than 82% of all polygons were correctly classified. Misclassification occurred mostly between residential area and barren/farmland. The presented method and the Chinese ZY-1 02C satellite imagery are robust and effective for urban landscape classification.
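The accuracy assessment above combines an error matrix with a kappa statistic. A small self-contained computation with an illustrative confusion matrix (not the study's figures):

```python
import numpy as np

# Cohen's kappa from an error (confusion) matrix, the statistic reported
# alongside overall accuracy.
def kappa(cm):
    cm = np.asarray(cm, dtype=float)
    n = cm.sum()
    p_observed = np.trace(cm) / n
    p_expected = (cm.sum(axis=0) * cm.sum(axis=1)).sum() / n ** 2
    return (p_observed - p_expected) / (1 - p_expected)

# Rows: reference class; columns: mapped class
# (say residential, barren/farmland, other) -- illustrative counts.
cm = [[90, 8, 2],
      [6, 88, 6],
      [1, 3, 96]]
print("kappa:", round(kappa(cm), 4))   # prints: kappa: 0.87
```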
Grinn-Gofroń, Agnieszka; Strzelczak, Agnieszka
2009-11-01
A study was made of the link between time of day, weather variables and the hourly content of certain fungal spores in the atmosphere of the city of Szczecin, Poland, in 2004-2007. Sampling was carried out with a Lanzoni 7-day-recording spore trap. The spores analysed belonged to the taxa Alternaria and Cladosporium. These spores were selected both for their allergenic capacity and for their high level presence in the atmosphere, particularly during summer. Spearman correlation coefficients between spore concentrations, meteorological parameters and time of day showed different indices depending on the taxon being analysed. Relative humidity (RH), air temperature, air pressure and clouds most strongly and significantly influenced the concentration of Alternaria spores. Cladosporium spores correlated less strongly and significantly than Alternaria. Multivariate regression tree analysis revealed that, at air pressures lower than 1,011 hPa the concentration of Alternaria spores was low. Under higher air pressure spore concentrations were higher, particularly when RH was lower than 36.5%. In the case of Cladosporium, under higher air pressure (>1,008 hPa), the spores analysed were more abundant, particularly after 0330 hours. In artificial neural networks, RH, air pressure and air temperature were the most important variables in the model for Alternaria spore concentration. For Cladosporium, clouds, time of day, air pressure, wind speed and dew point temperature were highly significant factors influencing spore concentration. The maximum abundance of Cladosporium spores in air fell between 1200 and 1700 hours.
Adams, Dean C
2014-09-01
Studies of evolutionary correlations commonly use phylogenetic regression (i.e., independent contrasts and phylogenetic generalized least squares) to assess trait covariation in a phylogenetic context. However, while this approach is appropriate for evaluating trends in one or a few traits, it is incapable of assessing patterns in highly multivariate data, as the large number of variables relative to sample size prohibits parametric test statistics from being computed. This poses serious limitations for comparative biologists, who must either simplify how they quantify phenotypic traits, or alter the biological hypotheses they wish to examine. In this article, I propose a new statistical procedure for performing ANOVA and regression models in a phylogenetic context that can accommodate high-dimensional datasets. The approach is derived from the statistical equivalency between parametric methods using covariance matrices and methods based on distance matrices. Using simulations under Brownian motion, I show that the method displays appropriate Type I error rates and statistical power, whereas standard parametric procedures have decreasing power as data dimensionality increases. As such, the new procedure provides a useful means of assessing trait covariation across a set of taxa related by a phylogeny, enabling macroevolutionary biologists to test hypotheses of adaptation, and phenotypic change in high-dimensional datasets.
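The distance-matrix equivalence the article exploits can be illustrated with a simple permutation test on pairwise Euclidean distances, in the spirit of PERMANOVA-style procedures; this is a sketch, not the article's exact phylogenetic statistic, and the data are synthetic:

```python
import numpy as np

rng = np.random.default_rng(6)

def perm_group_test(X, groups, n_perm=500):
    """Permutation test of group differences based on pairwise Euclidean
    distances; the statistic is mean between-group distance minus mean
    within-group distance."""
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    off_diag = ~np.eye(len(groups), dtype=bool)

    def stat(g):
        same = g[:, None] == g[None, :]
        return D[~same].mean() - D[same & off_diag].mean()

    observed = stat(groups)
    perms = [stat(rng.permutation(groups)) for _ in range(n_perm)]
    p = (1 + sum(s >= observed for s in perms)) / (n_perm + 1)
    return observed, p

# Hypothetical high-dimensional traits: more variables (40) than taxa (30),
# the regime where parametric test statistics cannot be computed.
X = rng.normal(size=(30, 40))
groups = np.repeat([0, 1], 15)
X[groups == 1] += 0.8                  # shift group 1 in every trait
observed, p_value = perm_group_test(X, groups)
print("statistic:", round(observed, 3), "p:", round(p_value, 3))
```

Because only the distance matrix enters the statistic, the dimensionality of the trait data never limits the test, which is the point of the article's procedure.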
Simulation and Modeling Methodologies, Technologies and Applications
Filipe, Joaquim; Kacprzyk, Janusz; Pina, Nuno
2014-01-01
This book includes extended and revised versions of a set of selected papers from the 2012 International Conference on Simulation and Modeling Methodologies, Technologies and Applications (SIMULTECH 2012) which was sponsored by the Institute for Systems and Technologies of Information, Control and Communication (INSTICC) and held in Rome, Italy. SIMULTECH 2012 was technically co-sponsored by the Society for Modeling & Simulation International (SCS), GDR I3, Lionphant Simulation, Simulation Team and IFIP and held in cooperation with AIS Special Interest Group of Modeling and Simulation (AIS SIGMAS) and the Movimento Italiano Modellazione e Simulazione (MIMOS).
Theoretical Model of Steel Continuous Casting Technology
Institute of Scientific and Technical Information of China (English)
C Gheorghies; I Crudu; C Teletin; C Spanu
2009-01-01
A theoretical model adapted for studying steel continuous casting technology was proposed. The model, based on system theory, contained input/output, command, and control parameters. The process was divided into five stages, i.e., tundish, mold, guiding system, guiding-drawing system, and guiding-drawing-soft-reduction system. The model can be used to describe the physicochemical processes, thermal processes, chemical processes, and characteristics of the cast material in each of these stages. It can also be applied to other metallurgical technologies and even to other industries (chemistry, food, etc.).
Mathematical Modeling of the Agriculture Crop Technology
Directory of Open Access Journals (Sweden)
D. Drucioc
1999-02-01
The organized structure of a computer system for the economic and ecological assessment of agricultural crop technologies is described. The system is composed of six interconnected blocks. Linear, non-linear and stochastic mathematical models for machinery sizing and selection in a farm-level cropping system are presented in the mathematical model block of the computer system.
Endogenizing technological progress: The MESEMET model
P.A.G. van Bergeijk (Peter); G.H.A. van Hagen; R.A. de Mooij (Ruud); J. van Sinderen (Jarig)
1997-01-01
This paper endogenizes technology and human capital formation in the MESEM model developed by van Sinderen (Economic Modelling, 1993, 13, 285-300). Tax allowances for private R&D expenditures and public expenditures on both education and R&D are effective instruments to stimulate…
Impact of Fractionation and Dose in a Multivariate Model for Radiation-Induced Chest Wall Pain
Energy Technology Data Exchange (ETDEWEB)
Din, Shaun U. [Department of Radiation Oncology, Memorial Sloan Kettering Cancer Center, New York, New York (United States); Williams, Eric L.; Jackson, Andrew [Department of Medical Physics, Memorial Sloan Kettering Cancer Center, New York, New York (United States); Rosenzweig, Kenneth E. [Department of Radiation Oncology, Mount Sinai Medical Center, New York, New York (United States); Wu, Abraham J.; Foster, Amanda [Department of Radiation Oncology, Memorial Sloan Kettering Cancer Center, New York, New York (United States); Yorke, Ellen D. [Department of Medical Physics, Memorial Sloan Kettering Cancer Center, New York, New York (United States); Rimner, Andreas, E-mail: rimnera@mskcc.org [Department of Radiation Oncology, Memorial Sloan Kettering Cancer Center, New York, New York (United States)
2015-10-01
Purpose: To determine the role of patient/tumor characteristics, radiation dose, and fractionation using the linear-quadratic (LQ) model to predict stereotactic body radiation therapy-induced grade ≥2 chest wall pain (CWP2) in a larger series, and to develop clinically useful constraints for patients treated with different fraction numbers. Methods and Materials: A total of 316 lung tumors in 295 patients were treated with stereotactic body radiation therapy in 3 to 5 fractions to 39 to 60 Gy. Absolute dose-absolute volume chest wall (CW) histograms were acquired. The raw dose-volume histograms (α/β = ∞ Gy) were converted via the LQ model to equivalent doses in 2-Gy fractions (normalized total dose, NTD) with α/β from 0 to 25 Gy in 0.1-Gy steps. The Cox proportional hazards (CPH) model was used in univariate and multivariate models to identify and assess CWP2 for a given physical dose and NTD. Results: The median follow-up was 15.4 months, and the median time to development of CWP2 was 7.4 months. On a univariate CPH model, prescription dose, prescription dose per fraction, number of fractions, D83cc, distance of tumor to CW, and body mass index were all statistically significant for the development of CWP2. Linear-quadratic correction improved the CPH model significance over the physical dose. The best-fit α/β was 2.1 Gy, and the physical dose (α/β = ∞ Gy) was outside the upper 95% confidence limit. With α/β = 2.1 Gy, V_NTD99Gy was most significant, with median V_NTD99Gy = 31.5 cm³ (hazard ratio 3.87, P<.001). Conclusion: There were several predictive factors for the development of CWP2. The LQ-adjusted dose using the best-fit α/β = 2.1 Gy is a better predictor of CWP2 than the physical dose. To aid dosimetrists, we have calculated the physical dose equivalent corresponding to V_NTD99Gy = 31.5 cm³ for the 3- to 5-fraction groups.
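The NTD conversion used here is the standard LQ equivalent-dose-in-2-Gy-fractions formula, NTD = D · (d + α/β) / (2 + α/β), where D is total dose and d = D/n is the dose per fraction. A minimal sketch (illustrative values, not the study's dose-volume data):

```python
def eqd2(total_dose, n_fractions, alpha_beta):
    """Equivalent dose in 2-Gy fractions (normalized total dose, NTD)
    from the linear-quadratic model: NTD = D * (d + a/b) / (2 + a/b).
    As alpha/beta -> infinity, NTD approaches the physical dose."""
    d = total_dose / n_fractions  # dose per fraction
    return total_dose * (d + alpha_beta) / (2.0 + alpha_beta)

# A 60 Gy / 3-fraction SBRT course with the paper's best-fit alpha/beta = 2.1 Gy:
print(round(eqd2(60, 3, 2.1), 1))  # → 323.4
```

The large NTD relative to the 60 Gy physical dose shows why the LQ correction matters for hypofractionated regimens: a low α/β sharply amplifies the biological effect of large fraction sizes.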
Institute of Scientific and Technical Information of China (English)
Anonymous
2001-01-01
It is well known that Landsat TM images are the most widely used remote sensing data in various fields. A TM image usually has 7 electromagnetic spectrum bands, among which the sixth (TM6) has a much lower ground resolution than the other six. Nevertheless, TM6 is useful in the study of rock spectrum reflection, geothermal resource exploration, etc. Improving the ground resolution of TM6 to the level of the other six bands is therefore a worthwhile problem. This paper presents an algorithm, based on the combination of a multivariate regression model with the semi-variogram function, that improves the ground resolution of TM6 by "fusing" the data of the other six bands. It includes the following main steps: (1) testing the correlation between TM6 and each of TM1-5 and TM7: if the correlation coefficient between TM6 and another band is greater than a given threshold value, that band is selected as an argument for the regression analysis; (2) calculating the size of the template window within which the parameters needed by the regression model will be calculated; (3) replacing the original pixel values of TM6 by those obtained from the regression analysis; (4) using image entropy as a measurement to evaluate the quality of the fused TM6 image. The basic mechanism of the algorithm is discussed and the VC++ program implementing it is also presented. A simple application example is given in the last part of this paper, showing the effectiveness of the algorithm.
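Steps (1), (3) and (4) can be sketched in a few lines. The following toy example (hypothetical pixel vectors, single predictor band, no template window or semi-variogram) screens a band by correlation, predicts TM6 by ordinary least squares, and scores the result with image entropy:

```python
import math
from collections import Counter

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / math.sqrt(vx * vy)

def fit_line(x, y):
    """Ordinary least squares y = a + b*x; returns (a, b)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((a - mx) * (c - my) for a, c in zip(x, y))
         / sum((a - mx) ** 2 for a in x))
    return my - b * mx, b

def image_entropy(pixels):
    """Shannon entropy of the grey-level histogram, in bits (step 4)."""
    counts = Counter(pixels)
    n = len(pixels)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Hypothetical co-registered pixel samples: a visible band and TM6.
tm1 = [10, 20, 30, 40, 50, 60, 70, 80]
tm6 = [15, 24, 37, 44, 56, 63, 76, 83]

if pearson(tm1, tm6) > 0.8:            # step (1): correlation screening
    a, b = fit_line(tm1, tm6)          # regression model
    fused = [round(a + b * v) for v in tm1]  # step (3): regression prediction
    print(round(image_entropy(fused), 2))    # → 3.0
```

Higher entropy in the fused band indicates richer grey-level detail, which is the paper's quality criterion.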
Deo, Ravinesh C.; Kisi, Ozgur; Singh, Vijay P.
2017-02-01
Drought forecasting using standardized metrics of rainfall is a core task in hydrology and water resources management. The Standardized Precipitation Index (SPI) is a rainfall-based metric that caters for the different time-scales at which drought occurs and, owing to its standardization, is well suited for forecasting drought at different periods in climatically diverse regions. This study advances drought modelling using multivariate adaptive regression splines (MARS), least square support vector machine (LSSVM), and M5Tree models by forecasting SPI in eastern Australia. The MARS model incorporated rainfall as a mandatory predictor, with month (periodicity), the Southern Oscillation Index, the Pacific Decadal Oscillation Index, the Indian Ocean Dipole, ENSO Modoki and Nino 3.0, 3.4 and 4.0 data added gradually. Performance was evaluated with root mean square error (RMSE), mean absolute error (MAE), and coefficient of determination (r2). The best MARS model required different input combinations: rainfall, sea surface temperature and periodicity were used for all stations, but the ENSO Modoki and Pacific Decadal Oscillation indices were not required for Bathurst, Collarenebri and Yamba, and the Southern Oscillation Index was not required for Collarenebri. Inclusion of periodicity increased the r2 value by 0.5-8.1% and reduced RMSE by 3.0-178.5%. Comparisons showed that MARS surpassed the other counterparts for three out of five stations, with lower MAE by 15.0-73.9% and 7.3-42.2%, respectively. For the other stations, M5Tree was better than MARS/LSSVM, with lower MAE by 13.8-13.4% and 25.7-52.2%, respectively, and for Bathurst, LSSVM yielded the more accurate result. For droughts identified by SPI ≤ - 0.5, accurate forecasts were attained by MARS/M5Tree for Bathurst, Yamba and Peak Hill, whereas for Collarenebri and Barraba, M5Tree was better than LSSVM/MARS. Seasonal analysis revealed disparate results where MARS/M5Tree was better than LSSVM. The results highlight the
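The three evaluation metrics used to compare the MARS, LSSVM and M5Tree forecasts are standard and easy to state exactly. A minimal sketch on hypothetical observed-versus-forecast SPI values (not the study's data):

```python
import math

def rmse(obs, pred):
    """Root mean square error."""
    return math.sqrt(sum((o - p) ** 2 for o, p in zip(obs, pred)) / len(obs))

def mae(obs, pred):
    """Mean absolute error."""
    return sum(abs(o - p) for o, p in zip(obs, pred)) / len(obs)

def r_squared(obs, pred):
    """Coefficient of determination relative to the mean of the observations."""
    m = sum(obs) / len(obs)
    ss_res = sum((o - p) ** 2 for o, p in zip(obs, pred))
    ss_tot = sum((o - m) ** 2 for o in obs)
    return 1 - ss_res / ss_tot

# Hypothetical observed vs forecast SPI values:
obs  = [-1.2, -0.5, 0.3, 0.8, -0.9, 1.1]
pred = [-1.0, -0.6, 0.1, 0.9, -0.7, 1.0]
print(round(rmse(obs, pred), 3),
      round(mae(obs, pred), 3),
      round(r_squared(obs, pred), 3))  # → 0.158 0.15 0.966
```

RMSE penalizes large errors more heavily than MAE, which is why the study reports both alongside r2.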
Schafer, R. M.; Sain, M. K.
1980-01-01
The paper presents the CARDIAD (complex acceptability region for diagonal dominance) method for achieving the diagonal dominance condition in the inverse Nyquist array approach to the analysis and design of multivariable systems in the frequency domain. A design example is given for a sixth order, 4-input, 4-output model of a turbofan engine.
Accurate, nonintrusive, and inexpensive techniques are needed to measure energy expenditure (EE) in free-living populations. Our primary aim in this study was to validate cross-sectional time series (CSTS) and multivariate adaptive regression splines (MARS) models based on observable participant cha...
Chiarello, Lisa A.; Palisano, Robert J.; Bartlett, Doreen J.; McCoy, Sarah Westcott
2011-01-01
A multivariate model of determinants of change in gross-motor ability and engagement in self-care and play provides physical and occupational therapists a framework for decisions on interventions and supports for young children with cerebral palsy and their families. Aspects of the child, family ecology, and rehabilitation and community services…
Capacity Expansion Modeling for Storage Technologies
Energy Technology Data Exchange (ETDEWEB)
Hale, Elaine; Stoll, Brady; Mai, Trieu
2017-04-03
The Resource Planning Model (RPM) is a capacity expansion model designed for regional power systems and high levels of renewable generation. Recent extensions capture value-stacking for storage technologies, including batteries and concentrating solar power with storage. After estimating per-unit capacity value and curtailment reduction potential, RPM co-optimizes investment decisions and reduced-form dispatch, accounting for planning reserves; energy value, including arbitrage and curtailment reduction; and three types of operating reserves. Multiple technology cost scenarios are analyzed to determine the level of deployment in the Western Interconnection under various conditions.
Okazaki, Shuntaro; Hirotani, Masako; Koike, Takahiko; Bosch-Bayard, Jorge; Takahashi, Haruka K; Hashiguchi, Maho; Sadato, Norihiro
2015-01-01
People's behaviors synchronize. It is difficult, however, to determine whether synchronized behaviors occur in a mutual direction--two individuals influencing one another--or in one direction--one individual leading the other, and what the underlying mechanism for synchronization is. To answer these questions, we hypothesized a non-leader-follower postural sway synchronization, caused by a reciprocal visuo-postural feedback system operating on pairs of individuals, and tested that hypothesis both experimentally and via simulation. In the behavioral experiment, 22 participant pairs stood face to face either 20 or 70 cm away from each other wearing glasses with or without vision blocking lenses. The existence and direction of visual information exchanged between pairs of participants were systematically manipulated. The time series data for the postural sway of these pairs were recorded and analyzed with cross correlation and causality. Results of cross correlation showed that postural sway of paired participants was synchronized, with a shorter time lag when participant pairs could see one another's head motion than when one of the participants was blindfolded. In addition, there was less of a time lag in the observed synchronization when the distance between participant pairs was smaller. As for the causality analysis, noise contribution ratio (NCR), the measure of influence using a multivariate autoregressive model, was also computed to identify the degree to which one's postural sway is explained by that of the other's and how visual information (sighted vs. blindfolded) interacts with paired participants' postural sway. It was found that for synchronization to take place, it is crucial that paired participants be sighted and exert equal influence on one another by simultaneously exchanging visual information. Furthermore, a simulation for the proposed system with a wider range of visual input showed a pattern of results similar to the behavioral results.
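The lag-based synchronization measure in this study is a cross-correlation peak: the lag at which two postural-sway time series correlate most strongly. The following sketch illustrates only that step (not the authors' noise-contribution-ratio or multivariate autoregressive analysis), on synthetic sway traces where one signal is a known 3-sample delay of the other:

```python
import math

def xcorr_lag(x, y, max_lag):
    """Return (lag, r) where the normalized cross-correlation of x and y
    peaks; a positive lag means y follows x by that many samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    xd = [v - mx for v in x]
    yd = [v - my for v in y]
    sx = sum(v * v for v in xd) ** 0.5
    sy = sum(v * v for v in yd) ** 0.5
    best_lag, best_r = 0, float("-inf")
    for lag in range(-max_lag, max_lag + 1):
        # Correlate over the overlapping region at this lag.
        r = sum(xd[i] * yd[i + lag]
                for i in range(n) if 0 <= i + lag < n) / (sx * sy)
        if r > best_r:
            best_lag, best_r = lag, r
    return best_lag, best_r

# Synthetic sway traces: y is x delayed by 3 samples.
x = [math.sin(0.2 * t) for t in range(200)]
y = [math.sin(0.2 * (t - 3)) for t in range(200)]
lag, r = xcorr_lag(x, y, max_lag=10)
print(lag)  # → 3
```

A near-zero best lag for sighted pairs, as the study found, is what distinguishes mutual non-leader-follower synchronization from one individual simply leading the other.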
A primer of multivariate statistics
Harris, Richard J
2014-01-01
Drawing upon more than 30 years of experience in working with statistics, Dr. Richard J. Harris has updated A Primer of Multivariate Statistics to provide a model of balance between how-to and why. This classic text covers multivariate techniques with a taste of latent variable approaches. Throughout the book there is a focus on the importance of describing and testing one's interpretations of the emergent variables that are produced by multivariate analysis. This edition retains its conversational writing style while focusing on classical techniques. The book gives the reader a feel for why
Exemplary Training Models in Industrial Technology.
Hatton, Michael J., Comp.
Prepared by Canadian, Chinese Taipei, and Thai educational agencies and based on surveys of Asia Pacific Economic Cooperation member nations, this report provides descriptions of 52 exemplary industrial technology training models in Australia, Brunei, Canada, Chinese Taipei, Hong Kong, Malaysia, New Zealand, the Philippines, the People's Republic…
A Cyber War Modeling Approach Based on Multivariate Network
Institute of Scientific and Technical Information of China (English)
邓志宏; 老松杨; 白亮
2013-01-01
Cyberspace is a newly emerging concept. We analyze the concept and features of Cyberspace and propose a hierarchical conceptual model for it. On this basis, we propose a Cyber war modeling approach based on multivariate networks, in which Cyber war modeling comprises two parts: (i) modeling the static structure of the warfare system and (ii) modeling the dynamic process of engagement. Accordingly, a multivariate network model of the Cyberspace warfare system and a multivariate network evolution model of the Cyber war engagement process are established. Finally, the established models are applied to the analysis of a typical Cyber war.