
Signal Processing

Volume 133, April 2017, Pages 122-134

Conditional independence graphs for multivariate autoregressive models by convex optimization: Efficient algorithms

https://doi.org/10.1016/j.sigpro.2016.10.023

Highlights

  • Conditional independence graph of a vector autoregressive (VAR) process.

  • Efficient implementation of convex optimization algorithms for inferring the graph.

  • Derivation of renormalized maximum likelihood criterion for VAR-order selection.

Abstract

In this paper, we introduce novel algorithms for inferring the conditional independence graph of a vector autoregressive (VAR) process. As part of this work, we derive the renormalized maximum likelihood criterion for VAR-order selection and prove its consistency. Finding the graphical model of a VAR process reduces to identifying the sparsity pattern of the inverse of its spectral density matrix. We show how efficient implementations of convex optimization algorithms can be used to solve this problem; in our approach, the high-sparsity assumption is not needed. We conduct experiments with simulated data, air pollution data and stock market data to demonstrate that our algorithms are faster and more accurate than similar methods proposed in the previous literature.

Section snippets

Motivation

In recent years, there has been serious interest in the signal processing community in novel methods for graphical modeling of time series (see, for example, [1], [2]). The newly proposed algorithms are designed to have a relatively low computational complexity and, at the same time, to guarantee good estimation performance when a particular set of assumptions is satisfied. The use of convex optimization is deemed very promising for finding graphical models for multivariate time series [3]

Two-stage approach

Instead of the Full-Search from [3], we propose the two-stage approach described in Algorithm 1. Note that the stages of our estimation procedure differ from those in [6]. In our case, in Stage 1, an ITC is employed to select the best order, say p̂ (see Section 3.1). The most important consequence is that, in Stage 2, all the estimations are performed only for order p̂ and not for all the orders within the set {1, …, p_max}.

Algorithm 1

Two-Stage Method

Stage 1 [Select p̂]:
 for all p ∈ {1, …, p_max}
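A minimal sketch of the two-stage idea, assuming zero-mean data and a plain least-squares VAR fit with SBC as the ITC (the paper's procedure uses ARFIT and the RNML criterion; all names below are illustrative):

```python
import numpy as np

def fit_var_ols(y, p):
    """Least-squares fit of a VAR(p); returns coefficients and residual covariance."""
    T, K = y.shape
    Y = y[p:]                                                  # responses y_t, t = p..T-1
    X = np.hstack([y[p - j:T - j] for j in range(1, p + 1)])   # lagged regressors
    B, *_ = np.linalg.lstsq(X, Y, rcond=None)
    resid = Y - X @ B
    Sigma = resid.T @ resid / (T - p)
    return B, Sigma

def sbc(y, p):
    """Schwarz criterion for VAR order p (one common form; a sketch)."""
    T, K = y.shape
    _, Sigma = fit_var_ols(y, p)
    Teff = T - p
    _, logdet = np.linalg.slogdet(Sigma)
    return logdet + (np.log(Teff) / Teff) * p * K * K

# Toy usage: simulate a stable VAR(2) and recover its order.
rng = np.random.default_rng(0)
K, T = 3, 2000
A1, A2 = 0.4 * np.eye(K), 0.3 * np.eye(K)
y = np.zeros((T, K))
for t in range(2, T):
    y[t] = y[t - 1] @ A1.T + y[t - 2] @ A2.T + rng.standard_normal(K)

p_max = 6
p_hat = min(range(1, p_max + 1), key=lambda p: sbc(y, p))   # Stage 1: order selection only
B_hat, Sigma_hat = fit_var_ols(y, p_hat)                    # Stage 2: estimate only at p_hat
```

The point of the two-stage structure is visible in the last two lines: the graph-estimation stage runs once, at p̂, rather than once per candidate order.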

Main algorithmic steps and their computational complexity

Stage 1. This is a classical problem in signal processing, and to solve it we employ the ARFIT algorithm, which guarantees that the complexity of computing the estimates for all considered orders is O(TK²p_max²) (see [24, Section 3.3]). We note in passing that, in the ARFIT algorithm, the VAR model is recast as a linear regression model (see also A.1). The estimates of the parameters are obtained by solving downdating least-squares problems in which the order of the model is decreased from p
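The downdating idea can be illustrated with a single QR factorization: if the regressor columns are ordered by increasing lag, dropping the highest lag amounts to dropping trailing columns, so the leading block of R yields every lower-order estimate at negligible extra cost. This is only a sketch of that idea, not the ARFIT implementation itself (function names are ours):

```python
import numpy as np

def var_ls_all_orders(y, p_max):
    """Least-squares VAR estimates for all orders 1..p_max from one QR
    factorization, fitted on the common sample t = p_max..T-1 (a sketch)."""
    T, K = y.shape
    Y = y[p_max:]
    # Regressor columns ordered lag 1, lag 2, ..., lag p_max.
    X = np.hstack([y[p_max - j:T - j] for j in range(1, p_max + 1)])
    Q, R = np.linalg.qr(X)          # one factorization, reused for every order
    QtY = Q.T @ Y
    coeffs = {}
    for p in range(p_max, 0, -1):
        m = K * p                   # dropping the highest lag = dropping trailing columns
        coeffs[p] = np.linalg.solve(R[:m, :m], QtY[:m])
    return coeffs
```

Because X[:, :m] = Q[:, :m] R[:m, :m] for an upper-triangular R, each triangular solve gives exactly the least-squares solution for the reduced-order model on the common sample.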

Model selection rules

For selecting the VAR order we use not only the RNML and SBC criteria, but also AIC [15], AICc [30], FPE (final prediction error) [31], KIC (Kullback information criterion) [32], and KICc ("corrected" KIC) [33]. Note that in [3] only SBC, AIC and AICc have been considered. We emphasize that the formula for AICc we use in Stage 1 is not the same as the one from [3]. We prefer the AICc expression from [30] because its derivation is tailored to VAR models.
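For intuition, these criteria differ mainly in the penalty they add to ln|Σ̂|. The snippet below uses the standard textbook penalty forms with d = pK² free parameters; these are not the exact VAR-tailored AICc/KICc expressions from [30], [33]:

```python
import numpy as np

# Per-sample penalty added to ln|Sigma_hat| by common criteria
# (standard textbook forms; d = p*K**2 parameters, T = effective sample size).
penalties = {
    "AIC": lambda d, T: 2 * d / T,
    "KIC": lambda d, T: 3 * d / T,           # Kullback information criterion
    "SBC": lambda d, T: np.log(T) * d / T,   # Schwarz criterion (BIC)
}

d, T = 4 * 3**2, 500    # e.g. K = 3 series, candidate order p = 4
for name, pen in penalties.items():
    print(f"{name}: {pen(d, T):.4f}")
```

For any T > e³ ≈ 20 the ordering is AIC < KIC < SBC, so SBC favors smaller orders than AIC on the same data; this is why consistent criteria such as SBC and RNML behave differently from efficiency-oriented ones.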

Another aspect concerns the alteration of

Final remarks

In this paper, we have proposed a family of algorithms for inferring the conditional independence graph of a VAR(p)-model for K-variate time series. Our theoretical and empirical results demonstrate that the algorithms from this family can be used when p ≤ 20, K ≤ 10 and K × p ≤ 100. Thus far, the methods which rely on convex optimization and do not ask the user to make a subjective choice of parameters have been suitable only for much smaller values of p and K. Another important feature of our method is

References (48)

  • T. Speed et al.

    Gaussian Markov distributions over finite graphs

    Ann. Stat.

    (1986)
  • Y. Matsuda

    A test statistic for graphical modelling of multivariate time series

    Biometrika

    (2006)
  • H. Lütkepohl

    New Introduction to Multiple Time Series Analysis

    (2005)
  • M. Eichler, Fitting graphical interaction models to multivariate time series, in: Proceedings of the 22nd Conference...
  • S. Lauritzen

    Graphical Models

    (1996)
  • G. Schwarz

    Estimating the dimension of a model

    Ann. Stat.

    (1978)
  • H. Akaike

    A new look at the statistical model identification

    IEEE Trans. Autom. Control

    (1974)
  • P. Brockwell et al.

    Time Series: Theory and Methods

    (1991)
  • J. Songsiri et al.

    Topology selection in graphical models of autoregressive processes

    J. Mach. Learn. Res.

    (2010)
  • E. Avventi et al.

    ARMA identification of graphical models

    IEEE Trans. Autom. Control

    (2013)
  • M. Zorzi et al.

    AR identification of latent-variable graphical models

    IEEE Trans. Autom. Control

    (2016)
  • J. Rissanen

    MDL denoising

    IEEE Trans. Inf. Theory

    (2000)
  • J. Rissanen

    Information and Complexity in Statistical Modeling

    (2007)
  • S. Maanan, B. Dumitrescu, C. Giurcăneanu, Renormalized maximum likelihood for multivariate autoregressive models, in:...

    This work was supported by Dept. of Statistics (UOA) Doctoral Scholarship.
