
Data-Driven Linear Complexity Low-Rank Approximation of General Kernel Matrices: A Geometric Approach

Difeng Cai, Edmond Chow, Yuanzhe Xi
Dec 2022
Abstract
A general, {\em rectangular} kernel matrix may be defined as $K_{ij} = \kappa(x_i, y_j)$, where $\kappa(x,y)$ is a kernel function and where $X=\{x_i\}_{i=1}^m$ and $Y=\{y_j\}_{j=1}^n$ are two sets of points. In this paper, we seek a low-rank approximation to a kernel matrix where the sets of points $X$ and $Y$ are large and are not well-separated (e.g., the points in $X$ and $Y$ may be ``intermingled''). Such rectangular kernel matrices may arise, for example, in Gaussian process regression, where $X$ corresponds to the training data and $Y$ corresponds to the test data. In this case, the points are often high-dimensional. Since the point sets are large, we must exploit the fact that the matrix arises from a kernel function and avoid forming the matrix, thus ruling out most algebraic techniques. In particular, we seek methods that scale linearly, i.e., with computational complexity $O(m)$ or $O(n)$ for a fixed accuracy or rank. The main idea in this paper is to {\em geometrically} select appropriate subsets of points to construct a low-rank approximation. An analysis in this paper guides how this selection should be performed.
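One concrete shape such a construction can take is a Nyström-style cross approximation: choose a small set $Z$ of $k$ landmark points and approximate $K \approx K(X,Z)\,K(Z,Z)^{+}\,K(Z,Y)$, which requires only $O((m+n)k)$ kernel evaluations instead of the full $m \times n$ matrix. The sketch below is a minimal illustration of this generic idea under stated assumptions, not the paper's method: it assumes a Gaussian kernel and uses farthest-point sampling as a stand-in geometric selection rule, whereas the paper derives an analysis-guided selection. All names here (gaussian_kernel, farthest_point_sampling, nystrom_cross_approx) are ours.

```python
import numpy as np

def gaussian_kernel(A, B, sigma=1.0):
    """Pairwise kappa(x, y) = exp(-||x - y||^2 / (2 sigma^2))."""
    sq = np.sum(A**2, axis=1)[:, None] + np.sum(B**2, axis=1)[None, :] - 2 * A @ B.T
    return np.exp(-np.maximum(sq, 0.0) / (2 * sigma**2))

def farthest_point_sampling(P, k, seed=0):
    """Greedily pick k well-spread points from P (a stand-in geometric rule)."""
    rng = np.random.default_rng(seed)
    idx = [int(rng.integers(len(P)))]
    d = np.linalg.norm(P - P[idx[0]], axis=1)
    for _ in range(k - 1):
        idx.append(int(np.argmax(d)))          # farthest from current landmarks
        d = np.minimum(d, np.linalg.norm(P - P[idx[-1]], axis=1))
    return np.array(idx)

def nystrom_cross_approx(X, Y, k, sigma=1.0):
    """Rank-k factors of K ~ K(X,Z) K(Z,Z)^+ K(Z,Y) without forming K.

    Landmarks Z are drawn from the union of X and Y, since the two sets
    may be intermingled. Cost: O((m + n) k) kernel evaluations + O(k^3).
    """
    P = np.vstack([X, Y])
    Z = P[farthest_point_sampling(P, k)]
    U = gaussian_kernel(X, Z, sigma)                   # m x k
    C = np.linalg.pinv(gaussian_kernel(Z, Z, sigma))   # k x k core
    V = gaussian_kernel(Z, Y, sigma)                   # k x n
    return U, C, V                                     # K_approx = U @ C @ V

# Usage: approximate a 2000 x 1500 Gaussian kernel matrix at rank 50.
rng = np.random.default_rng(1)
X = rng.standard_normal((2000, 3))
Y = rng.standard_normal((1500, 3))
U, C, V = nystrom_cross_approx(X, Y, k=50, sigma=2.0)
K_true = gaussian_kernel(X, Y, sigma=2.0)              # only for the error check
err = np.linalg.norm(U @ C @ V - K_true) / np.linalg.norm(K_true)
print(f"relative Frobenius error: {err:.2e}")
```

Farthest-point sampling is only one heuristic for spreading landmarks over the combined point set; the point of the exercise is that the factors are built from $O((m+n)k)$ kernel evaluations, and the full $m \times n$ matrix is materialized only in the final error check.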