
# The Kolmogorov Superposition Theorem can Break the Curse of Dimensionality When Approximating High Dimensional Functions

We explain how to use Kolmogorov's Superposition Theorem (KST) to overcome the curse of dimensionality when approximating multi-dimensional functions and learning multi-dimensional data sets with two-layer neural networks. Specifically, there is a class of functions, called $K$-Lipschitz continuous in the sense that the K-outer function $g$ of $f$ is Lipschitz continuous, that can be approximated by a two-layer ReLU network with widths $dn$ and $n$ at an approximation order of $O(d^2/n)$. In addition, we show that polynomials of high degree can be expressed by multi-layer neural networks with the activation function $\sigma_\ell(t)=(t_+)^\ell$, $\ell\ge 2$, and appropriate widths. The more layers the network has, the higher the degree of polynomials it can reproduce. Hence, a deep learning algorithm with the high-degree activation function $\sigma_\ell$ can approximate multi-dimensional data well as the number of layers increases. Finally, we present a mathematical justification for image classification by a deep learning algorithm.
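For reference, the phrase "the K-outer function $g$ of $f$" points to the single-outer-function form of KST due to Lorentz; the exact normalization of the inner functions used in the paper may differ, so the statement below is a standard reference version rather than the paper's own:

```latex
% Kolmogorov's Superposition Theorem (Lorentz's single-outer-function
% version): every continuous f on [0,1]^d admits the exact representation
\[
  f(x_1,\dots,x_d)
  = \sum_{q=0}^{2d} g\!\Big(\sum_{p=1}^{d} \lambda_p\,\phi_q(x_p)\Big),
\]
% where the inner functions \phi_q and the constants \lambda_p > 0 depend
% only on the dimension d, while all information about f is carried by the
% single outer function g (the "K-outer function" of f).
```

The two-layer ReLU approximation then amounts to approximating the fixed inner functions $\phi_q$ and the one-dimensional outer function $g$, which is where the Lipschitz assumption on $g$ enters.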
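The claim about $\sigma_\ell$ rests on exact algebraic identities for rectified powers. Below is a minimal numpy sketch (not the paper's construction; the function names are illustrative) checking two such identities for $\ell=2$: $t^2=\sigma_2(t)+\sigma_2(-t)$, and the polarization identity $xy=\tfrac12\big((x+y)^2-x^2-y^2\big)$, which suggests how each extra $\sigma_2$ layer can multiply previous outputs and thereby raise the reproducible polynomial degree.

```python
import numpy as np

def sigma(t, ell=2):
    """RePU activation sigma_ell(t) = (max(t, 0))**ell."""
    return np.maximum(t, 0.0) ** ell

# One hidden layer of two sigma_2 neurons reproduces t^2 exactly:
#   t^2 = sigma_2(t) + sigma_2(-t)
t = np.linspace(-3.0, 3.0, 101)
assert np.allclose(sigma(t) + sigma(-t), t ** 2)

def product(x, y):
    """Exact product via squaring neurons and the polarization identity
    x*y = ((x + y)^2 - x^2 - y^2) / 2 (illustrative helper, not from
    the paper)."""
    sq = lambda u: sigma(u) + sigma(-u)  # exact square via two sigma_2 units
    return 0.5 * (sq(x + y) - sq(x) - sq(y))

x, y = np.random.rand(50), np.random.rand(50)
assert np.allclose(product(x, y), x * y)
print("sigma_2 layers reproduce t^2 and x*y exactly")
```

Composing such squaring/multiplication layers doubles the representable monomial degree with each added layer, which is consistent with the abstract's statement that deeper $\sigma_\ell$ networks reproduce higher-degree polynomials.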
