Time series analysis via maximum variance frames
Date
2025-01-14
Authors
Bonenberger, Christopher
Publication Type
Dissertation
Abstract
Shift invariance is a central characteristic of time series data. Many methods in signal processing and machine learning exploit the corresponding local dependencies by means of convolutions. In this dissertation, we transfer the idea of shift invariance by convolutions to a basic machine learning setting. In short, we present a generalization of Principal Component Analysis (PCA) towards structured representation learning, which we refer to as Maximum Variance Frames (MVFs).
While PCA, a fundamental technique in machine learning and statistics, is based on pointwise correlations, the proposed approach makes it possible to incorporate prior knowledge, foremost that time series data is governed by local dependencies. To this end, we define a wide class of linear transformations based on different modifications of circulant matrices or, equivalently, circular convolutions. We state the general problem of learning corresponding structured data representations under the premise of variance maximization and provide analytical solutions to it. Along with the solution, we develop a formalism that consistently implements data transformations based on these structures in the form of a finite frame. Moreover, we examine several possibilities for fast implementations based on the fast Fourier transform.
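A minimal numpy sketch of this idea, with illustrative toy data and variable names (not the exact MVF construction from the dissertation): projecting a sample covariance onto the set of circulant matrices and diagonalizing the result with the FFT shows how, under this structural constraint, the maximum-variance directions become Fourier modes.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 200 noisy sinusoids of length 64 with random phases.
N, D = 200, 64
t = np.arange(D)
phases = rng.uniform(0, 2 * np.pi, (N, 1))
X = np.sin(2 * np.pi * 5 * t / D + phases) + 0.5 * rng.standard_normal((N, D))
X -= X.mean(axis=0)

# Sample covariance (D x D) and its projection onto circulant matrices,
# obtained by averaging entries along circular diagonals:
# c[k] = mean_i C[i, (i + k) % D].
C = X.T @ X / N
c = np.array([np.diag(np.roll(C, -k, axis=1)).mean() for k in range(D)])
Ccirc = np.array([np.roll(c, i) for i in range(D)])   # symmetric circulant matrix

# A circulant matrix is diagonalized by the DFT: its eigenvalues are the FFT
# of c and its eigenvectors are Fourier basis vectors.
lam_fft = np.fft.fft(c).real
lam_eig = np.linalg.eigvalsh(Ccirc)
assert np.allclose(np.sort(lam_fft), lam_eig, atol=1e-8)

# Under the circulant constraint the maximum-variance directions are therefore
# Fourier modes, ordered by lam_fft; here bin 5 and its mirror D - 5 dominate.
print("dominant frequency bins:", np.argsort(lam_fft)[::-1][:2])
```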
Taking different structures as a basis for this kind of representation learning, the proposed approach can model a wide range of statistical dependencies. In fact, several well-known time series analysis techniques arise as special cases of Maximum Variance Frames. More precisely, we prove that the discrete Fourier transform, Dynamic PCA, and Singular Spectrum Analysis are incorporated within the proposed framework, which offers exact mathematical descriptions of their relations. In addition, we show that optimizing a certain class of structures implicitly formulates a novel approach to power spectral density estimation and establishes a corresponding estimator.
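For orientation, the snippet below implements classical Singular Spectrum Analysis (PCA of a lagged trajectory matrix followed by diagonal averaging), one of the special cases mentioned above, in plain numpy. It illustrates the type of structured representation the framework subsumes; it is not the MVF derivation itself, and the toy signal and window length are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(1)

# Noisy signal: linear trend plus an oscillation.
n, L = 300, 40                         # series length, embedding (window) length
t = np.arange(n)
x = 0.01 * t + np.sin(2 * np.pi * t / 25) + 0.3 * rng.standard_normal(n)

# Trajectory (Hankel) matrix of lagged windows, shape (n - L + 1, L).
K = n - L + 1
T = np.stack([x[i:i + L] for i in range(K)])

# PCA of the lag-covariance matrix = classical SSA.
S = T.T @ T / K
w, V = np.linalg.eigh(S)
order = np.argsort(w)[::-1]
w, V = w[order], V[:, order]

# Reconstruct the series from the leading r components by diagonal averaging.
r = 3
Tr = (T @ V[:, :r]) @ V[:, :r].T       # rank-r approximation of the trajectory matrix
x_hat = np.zeros(n)
counts = np.zeros(n)
for i in range(K):
    x_hat[i:i + L] += Tr[i]
    counts[i:i + L] += 1
x_hat /= counts

print("leading eigenvalues:", np.round(w[:4], 3))
```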
We demonstrate the performance of this estimator by comparing it to several well-known methods of spectral estimation. Moreover, we highlight the close relations to subspace methods for spectral estimation and to moving-average models.
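The standard baselines such a comparison typically includes can be computed with scipy; the sketch below contrasts a raw periodogram with Welch's averaged estimate on a toy signal. The MVF-based estimator itself is not reproduced here, and the signal parameters are arbitrary.

```python
import numpy as np
from scipy.signal import periodogram, welch

rng = np.random.default_rng(2)
fs = 100.0                                    # sampling rate in Hz
t = np.arange(0, 20, 1 / fs)
x = np.sin(2 * np.pi * 12.5 * t) + rng.standard_normal(t.size)

# Raw periodogram: fine frequency resolution but high variance.
f_p, P_p = periodogram(x, fs=fs)

# Welch's method: averages windowed periodograms, trading resolution for variance.
f_w, P_w = welch(x, fs=fs, nperseg=256)

print("periodogram peak at %.2f Hz" % f_p[np.argmax(P_p)])
print("Welch peak at       %.2f Hz" % f_w[np.argmax(P_w)])
```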
Furthermore, the proposed framework is extended towards supervised learning in the spirit of linear discriminant analysis. Beyond that, we show how Maximum Variance Frames can be used for image processing and provide a corresponding nonlinear kernel algorithm analogous to kernel PCA. Finally, the presented framework contributes to a deeper understanding and more effective utilization of machine learning and time series analysis techniques, and provides new approaches for research and practical applications.
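Since the kernel algorithm is described as analogous to kernel PCA, the following sketch shows standard kernel PCA with an RBF kernel (centering the Gram matrix and projecting onto its leading eigenvectors). The kernel choice, the toy data, and all parameter values are illustrative assumptions; this is the template referred to above, not the MVF-specific kernel algorithm.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy data: two concentric rings in 2-D, a classic case where linear PCA fails.
n = 200
angles = rng.uniform(0, 2 * np.pi, n)
radii = np.repeat([1.0, 3.0], n // 2)
X = np.c_[radii * np.cos(angles), radii * np.sin(angles)]
X += 0.05 * rng.standard_normal(X.shape)

# RBF (Gaussian) kernel matrix.
gamma = 0.5
sq_dists = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K = np.exp(-gamma * sq_dists)

# Center the Gram matrix in feature space.
one_n = np.full((n, n), 1.0 / n)
Kc = K - one_n @ K - K @ one_n + one_n @ K @ one_n

# Eigendecomposition; kernel principal components are scaled eigenvectors.
w, V = np.linalg.eigh(Kc)
order = np.argsort(w)[::-1]
w, V = w[order], V[:, order]
Z = V[:, :2] * np.sqrt(np.maximum(w[:2], 0))   # 2-D kernel PCA projection

print("leading kernel eigenvalues:", np.round(w[:2], 3))
```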
Faculties
Fakultät für Ingenieurwissenschaften, Informatik und Psychologie
Institutions
Institut für Neuroinformatik
License
Lizenz A
Keywords
Representation Learning, Frame Theory, Shift Invariance, Discrete Fourier Transform, Singular Spectrum Analysis, Adaptive Filtering, Spectral Estimation, Time Series Analysis, Principal Component Analysis, Machine Learning, Signal Processing, Adaptive Filters, DDC 620 / Engineering & allied operations