TY - JOUR
T1 - Supervised Learning of Lyapunov Functions Using Laplace Averages of Approximate Koopman Eigenfunctions
AU - Deka, Shankar A.
AU - Dimarogonas, Dimos V.
PY - 2023/1/1
Y1 - 2023/1/1
N2 - Modern data-driven techniques have rapidly progressed beyond modelling and systems identification, with a growing interest in learning high-level dynamical properties of a system, such as safe-set invariance, reachability, input-to-state stability, etc. In this letter, we propose a novel supervised Deep Learning technique for constructing Lyapunov certificates, by leveraging Koopman Operator theory-based numerical tools (Extended Dynamic Mode Decomposition and Generalized Laplace Analysis) to robustly and efficiently generate explicit ground-truth data for training. This is in stark contrast to existing Deep Learning methods, where the loss functions plainly penalize Lyapunov condition violations in the absence of labelled data for direct regression. Furthermore, our approach leads to a linear parameterization of Lyapunov candidate functions in terms of stable eigenfunctions of the Koopman operator, making them more interpretable compared to standard DNN-based architectures. We demonstrate and validate our approach numerically using 2-dimensional and 10-dimensional examples.
KW - Eigenvalues and eigenfunctions
KW - Lyapunov methods
KW - Trajectory
KW - Convergence
KW - Deep learning
KW - Neural networks
KW - Stability analysis
UR - https://ieeexplore.ieee.org/document/10171181/
DO - 10.1109/LCSYS.2023.3291657
M3 - Article
SN - 2475-1456
VL - 7
SP - 3072
EP - 3077
JO - IEEE Control Systems Letters
JF - IEEE Control Systems Letters
M1 - 10171181
ER -