The Kolmogorov Superposition Theorem can Break the Curse of Dimensionality When Approximating High Dimensional Functions

Ming-Jun Lai, Zhaiming Shen
Abstract
We explain how to use Kolmogorov's Superposition Theorem (KST) to overcome the curse of dimensionality when approximating multi-dimensional functions and learning multi-dimensional data sets with two-layer neural networks. That is, there is a class of functions, called $K$-Lipschitz continuous in the sense that the K-outer function $g$ of $f$ is Lipschitz continuous, that can be approximated by a two-layer ReLU network with widths $dn$ and $n$ at an approximation order of $O(d^2/n)$. In addition, we show that polynomials of high degree can be expressed by neural networks with the activation function $\sigma_\ell(t)=(t_+)^\ell$, $\ell\ge 2$, using multiple layers and appropriate widths: the more layers the network has, the higher the degree of the polynomials it can reproduce. Hence, a deep learning algorithm with the high-degree activation function $\sigma_\ell$ can approximate multi-dimensional data well as the number of layers increases. Finally, we present a mathematical justification for image classification by a deep learning algorithm.
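The two-layer architecture referenced in the abstract can be sketched concretely. Below is a minimal illustration in Python/NumPy of a depth-two ReLU network with first-layer width $dn$ and second-layer width $n$, the shape the abstract attributes to the KST-based construction; the random weights here are placeholders, not the paper's KST inner functions or the K-outer function $g$.

```python
import numpy as np

def relu(t):
    return np.maximum(t, 0.0)

def two_layer_relu(x, W1, b1, W2, b2, w3, b3):
    """Forward pass of a depth-2 ReLU network from R^d to R."""
    h1 = relu(W1 @ x + b1)   # first hidden layer, width d*n
    h2 = relu(W2 @ h1 + b2)  # second hidden layer, width n
    return w3 @ h2 + b3      # scalar output

# Placeholder weights; the paper's construction would derive them
# from the KST inner functions and the Lipschitz K-outer function g.
d, n = 4, 8
rng = np.random.default_rng(0)
W1, b1 = rng.standard_normal((d * n, d)), rng.standard_normal(d * n)
W2, b2 = rng.standard_normal((n, d * n)), rng.standard_normal(n)
w3, b3 = rng.standard_normal(n), 0.0

x = rng.standard_normal(d)
print(two_layer_relu(x, W1, b1, W2, b2, w3, b3))
```

The claim about $\sigma_\ell(t)=(t_+)^\ell$ reproducing polynomials can likewise be checked in a small worked example for $\ell=2$: a width-two $\sigma_2$ layer reproduces $t^2$ exactly via the identity $\sigma_2(t)+\sigma_2(-t)=t^2$, and composing a second $\sigma_2$ layer raises the degree to four, consistent with the statement that more layers reproduce higher-degree polynomials.

```python
import numpy as np

def sigma(t, ell=2):
    # sigma_ell(t) = (t_+)^ell, the activation from the abstract
    return np.maximum(t, 0.0) ** ell

t = np.linspace(-3.0, 3.0, 7)

# One sigma_2 layer of width two reproduces t^2 exactly.
assert np.allclose(sigma(t) + sigma(-t), t ** 2)

# Since t^2 >= 0, a second sigma_2 layer gives sigma_2(t^2) = t^4,
# a degree-4 polynomial: depth raises the reproducible degree.
assert np.allclose(sigma(sigma(t) + sigma(-t)), t ** 4)
```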