This repository contains Python code for computing low-rank approximations of parametric kernel matrices with the tensor train decomposition. It accompanies the paper
Khan, A., & Saibaba, A. K. (2024). Parametric kernel low-rank approximations using tensor train decomposition. Submitted. arXiv preprint arXiv:2406.06344.
The results in this folder were generated by this repository on the Hazel cluster at North Carolina State University.
The Python code requires the following packages to generate the tables:
- numpy
- scipy
- pandas
- tensorly
- pytorch
- numba
- scikit-learn
- tntorch (forked)
- py-markdown-table
- RP-Cholesky (a local copy is included in this repository for reproducibility)
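
The supported install path is `setup_env.sh` (see below). If you prefer a manual setup, a minimal sketch follows; it assumes the PyPI package names match the list above (PyTorch installs as `torch`) and that the forked tntorch and the vendored RP-Cholesky are handled separately, as they are in this repository.

```sh
# Hypothetical manual install; setup_env.sh is the supported path.
pip install numpy scipy pandas tensorly torch numba scikit-learn py-markdown-table

# tntorch must come from the forked source referenced in setup_env.sh, and
# RP-Cholesky ships as a local copy, so neither is installed from PyPI here.
```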
To use this code in your research, please see the License. If you find our code useful, please consider citing our paper:
```bibtex
@article{khan2024parametric,
  title={Parametric kernel low-rank approximations using tensor train decomposition},
  author={Khan, Abraham and Saibaba, Arvind K},
  journal={arXiv preprint arXiv:2406.06344},
  year={2024}
}
```
To set up the environment and run all of the experiments:

```sh
chmod +x setup_env.sh
./setup_env.sh
chmod +x run_experiments.sh
./run_experiments.sh
```
- experiment_1 compares our TTK (Tensor Train Kernel) method with thin SVD and reproduces Table 2 in the paper.
- experiment_2 compares our PTTK method with Adaptive Cross Approximation (ACA) and reproduces Table 3 and Table 4 in the paper.
- experiment_3 performs PTTK-Global-1 and PTTK-Global-2 on symmetric kernel matrices and reproduces Table 5 in the paper.
- experiment_4 compares our PTTK-Global-1 and PTTK-Global-2 methods with RP-Cholesky and Nyström on symmetric positive semi-definite kernel matrices and reproduces Table 6 in the paper.
- experiment_5 compares our PTTK method with the Tucker decomposition applied to parametric kernel matrices in three spatial dimensions and reproduces Table 7 in the paper; it can easily be switched to two spatial dimensions by changing the dimension parameter, which reproduces Table 6.
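
If you want to run a single experiment rather than the full suite, something like the following should work; the entry-point name below is an assumption, and `run_experiments.sh` shows the exact command used for each experiment.

```sh
# Hypothetical: run one experiment on its own. The script name is assumed;
# consult run_experiments.sh for the canonical per-experiment invocation.
python experiment_1.py
```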
This work was funded by the National Science Foundation through the awards DMS-1845406, DMS-1821149, and DMS-2026830.