By Hallin, Marc; Hörmann, Siegfried; Lippi, Marco
Reference Statistical inference for stochastic processes, 21, 2, pages 385-398
Publication Published, 2018-07
Peer-reviewed article
Abstract: Dimension reduction techniques are at the core of the statistical analysis of high-dimensional and functional observations. Whether the data are vector- or function-valued, principal component techniques play a central role in this context. The success of principal components in the dimension reduction problem is explained by the fact that, for any K ≤ p, the first K coefficients in the expansion of a p-dimensional random vector X in terms of its principal components provide the best K-dimensional linear summary of X in the mean-square sense. The same property holds for a random function and its functional principal component expansion. This optimality property, however, no longer holds in a time series context: when the observations are serially dependent, principal components and functional principal components lose their optimal dimension-reduction property to the so-called dynamic principal components introduced by Brillinger in 1981 in the vector case and, in the functional case, to their functional extension proposed by Hörmann, Kidziński and Hallin in 2015.
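A minimal illustrative sketch (not taken from the article, and assuming i.i.d. Gaussian data generated for demonstration only): it empirically checks the optimality property stated above, namely that projecting a p-dimensional random vector X onto its first K principal components yields a smaller mean-square reconstruction error than any other rank-K orthogonal projection, here compared against a random K-dimensional subspace.

```python
import numpy as np

rng = np.random.default_rng(0)
p, K, n = 10, 3, 100_000  # hypothetical dimensions and sample size, chosen for illustration

# Simulate centered observations of X with a non-trivial covariance structure.
A = rng.normal(size=(p, p))
X = rng.normal(size=(n, p)) @ A.T
X -= X.mean(axis=0)

# Principal components: eigenvectors of the sample covariance; keep the leading K.
cov = X.T @ X / n
eigvals, eigvecs = np.linalg.eigh(cov)   # eigenvalues in ascending order
V_pca = eigvecs[:, -K:]                  # top-K eigenvectors

# A competing K-dimensional linear summary: projection onto a random K-dim subspace.
V_rand, _ = np.linalg.qr(rng.normal(size=(p, K)))

def reconstruction_mse(V):
    """Mean-square error of reconstructing X from its projection onto span(V)."""
    X_hat = X @ V @ V.T
    return np.mean(np.sum((X - X_hat) ** 2, axis=1))

print("PCA    rank-K reconstruction MSE:", reconstruction_mse(V_pca))
print("Random rank-K reconstruction MSE:", reconstruction_mse(V_rand))
# The PCA error is never larger: this is the static optimality that, per the abstract,
# breaks down under serial dependence and is recovered by dynamic principal components.
```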