
# Data-Driven Linear Complexity Low-Rank Approximation of General Kernel Matrices: A Geometric Approach

Dec 2022

A general, *rectangular* kernel matrix may be defined as $K_{ij} = \kappa(x_i, y_j)$, where $\kappa(x,y)$ is a kernel function and $X=\{x_i\}_{i=1}^m$ and $Y=\{y_j\}_{j=1}^n$ are two sets of points. In this paper, we seek a low-rank approximation to a kernel matrix where the point sets $X$ and $Y$ are large and are not well-separated (e.g., the points in $X$ and $Y$ may be "intermingled"). Such rectangular kernel matrices may arise, for example, in Gaussian process regression, where $X$ corresponds to the training data and $Y$ corresponds to the test data. In this case, the points are often high-dimensional. Since the point sets are large, we must exploit the fact that the matrix arises from a kernel function and avoid forming the matrix explicitly, which rules out most algebraic techniques. In particular, we seek methods that scale linearly, i.e., with computational complexity $O(m)$ or $O(n)$ for a fixed accuracy or rank. The main idea in this paper is to *geometrically* select appropriate subsets of points from which to construct a low-rank approximation. An analysis in this paper guides how this selection should be performed.
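The idea of building a low-rank approximation from selected subsets of points can be illustrated with a skeleton (cross/Nyström-style) approximation: choose landmark rows $I \subset X$ and columns $J \subset Y$, then form $K \approx K_{:,J}\, K_{I,J}^{+}\, K_{I,:}$ using only $O((m+n)k)$ kernel evaluations. The sketch below is an illustrative assumption, not the authors' algorithm: it uses a Gaussian kernel and greedy farthest-point sampling as a simple stand-in for the paper's geometric selection, and forms the full matrix only to measure the error.

```python
import numpy as np

def kappa(x, y, h=1.0):
    """Gaussian kernel between rows of x and rows of y (illustrative choice)."""
    d2 = ((x[:, None, :] - y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * h * h))

def farthest_point_sample(pts, k, rng):
    """Greedy farthest-point sampling: k geometrically spread landmark indices."""
    idx = [int(rng.integers(len(pts)))]
    d = np.linalg.norm(pts - pts[idx[0]], axis=1)
    for _ in range(k - 1):
        j = int(np.argmax(d))            # point farthest from current landmarks
        idx.append(j)
        d = np.minimum(d, np.linalg.norm(pts - pts[j], axis=1))
    return np.array(idx)

def skeleton_lowrank(X, Y, k, rng):
    """Rank-<=k approximation K ~ K[:,J] pinv(K[I,J]) K[I,:].
    Only O((m+n)k) kernel evaluations; the full K is never formed here."""
    I = farthest_point_sample(X, k, rng)  # landmark rows from X
    J = farthest_point_sample(Y, k, rng)  # landmark columns from Y
    U = kappa(X, Y[J])                    # m x k
    C = kappa(X[I], Y[J])                 # k x k core
    V = kappa(X[I], Y)                    # k x n
    return U @ np.linalg.pinv(C) @ V

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 3))
Y = rng.standard_normal((150, 3))        # intermingled with X, not well-separated
K = kappa(X, Y)                          # formed only to check the error
K_approx = skeleton_lowrank(X, Y, 40, rng)
rel_err = np.linalg.norm(K - K_approx) / np.linalg.norm(K)
print(rel_err)
```

For a smooth kernel like the Gaussian, the singular values of $K$ decay quickly, so a modest rank already gives a small relative error; how well the selected subsets capture that decay is exactly what the paper's geometric analysis addresses.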

Q1: What problem does the paper attempt to solve?
Q2: Is this a new problem?
Q3: What scientific hypothesis does this paper seek to verify?