# Free probability via entropic optimal transport

Feb 2024

Let $\mu$ and $\nu$ be probability measures on $\mathbb{R}$ with bounded support, and let $\mu \boxplus \nu$ denote their additive free convolution. We show that for $z \in \mathbb{R}$ greater than the sum of the essential suprema of $\mu$ and $\nu$, we have \begin{equation*} \int_{-\infty}^\infty \log(z - x) \, \mu \boxplus \nu (\mathrm{d}x) = \sup_{\Pi} \left\{ \mathbf{E}_\Pi[\log(z - (X+Y))] - H(\Pi|\mu \otimes \nu) \right\}, \end{equation*} where the supremum is taken over all couplings $\Pi$ of the probability laws $\mu$ and $\nu$, and $H(\Pi|\mu \otimes \nu)$ denotes the relative entropy of a coupling $\Pi$ against the product measure. We prove similar formulas for the multiplicative free convolution $\mu \boxtimes \nu$ and the free compression $[\mu]_\tau$ of probability laws, as well as for multivariate free operations. Thus the log-integrals against the basic measure operations of free probability can be formulated in terms of entropic optimal transport problems. The maximisers in our variational descriptions of these free operations on measures can be computed explicitly, and from these we can then deduce the standard $R$- and $S$-transform descriptions of additive and multiplicative free convolution. We use our formulation to derive new inequalities relating free and classical convolutions of random variables, such as the inequality \begin{equation*} \int_{-\infty}^\infty \log(z - x) \, \mu \boxplus \nu (\mathrm{d}x) \geq \int_{-\infty}^{\infty} \log(z-x) \, \mu \ast \nu( \mathrm{d}x). \end{equation*} Our approach is based on applying a large deviation principle on the symmetric group to the quadrature formulas of Marcus, Spielman and Srivastava.
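For context, the relative entropy $H(\Pi|\mu \otimes \nu)$ appearing in the variational formula is the standard Kullback–Leibler divergence of the coupling against the product measure (the paper's precise normalisation may differ, but this is the usual convention in entropic optimal transport):

```latex
H(\Pi \mid \mu \otimes \nu) =
\begin{cases}
\displaystyle \int_{\mathbb{R}^2} \log \frac{\mathrm{d}\Pi}{\mathrm{d}(\mu \otimes \nu)} \, \mathrm{d}\Pi
  & \text{if } \Pi \ll \mu \otimes \nu, \\[1ex]
+\infty & \text{otherwise}.
\end{cases}
```

In particular $H(\Pi|\mu \otimes \nu) \geq 0$, with equality exactly when $\Pi = \mu \otimes \nu$, so the entropy term penalises couplings far from independence.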