How to bound the error for the Taylor expansion of the inverse of a mean of exponentials?
If $|x| \leq R / 10$ for some $R\in \mathbb{N}$, then it is easily shown that $$\left|e^{-x} - \sum_{k=0}^R \frac{(-1)^k x^k}{k!}\right| \leq e^{-R}.$$
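As a quick numerical sanity check of this single-variable bound (an illustration, not a proof; the values of $R$ and the sampling grid are arbitrary choices, and for large $R$ floating-point round-off would dominate the comparison):

```python
# Numerical illustration (not a proof) of
#   |e^{-x} - sum_{k=0}^R (-1)^k x^k / k!| <= e^{-R}   for |x| <= R/10.
import math

def taylor_exp_neg(x, R):
    """Degree-R Taylor polynomial of e^{-x} about 0."""
    return sum((-x) ** k / math.factorial(k) for k in range(R + 1))

for R in (5, 10, 20):
    grid = [R / 10 * t / 100 for t in range(-100, 101)]   # sample |x| <= R/10
    worst = max(abs(math.exp(-x) - taylor_exp_neg(x, R)) for x in grid)
    print(R, worst, math.exp(-R), worst <= math.exp(-R))
```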
I would like to have a similar result (i.e. with an error bound of the same form) for the multivariate function $\frac{1}{\frac{1}{M} \sum_{i=1}^M e^{x_i}}$, where I am free to impose whatever control is needed on $M\in \mathbb{N}$ and on each $x_i$.
Any ideas?
After playing with formal Taylor series, here's a partial answer. We can write $$\sum_{i=1}^M e^{x_i} = \sum_{\alpha\in \mathbb{N}_0^M} c_{\alpha} x^{\alpha}, \quad \text{with } c_{\alpha} = \begin{cases} M, &\mbox{if } \alpha = 0, \\ \frac{1}{k!}, &\mbox{if } \alpha = k e_i \text{ for some } i \text{ and } k \geq 1, \\ 0, &\mbox{otherwise}, \end{cases}$$ where $e_i$ is the $i$-th standard basis vector, and $x^{\alpha} = \prod_{i=1}^M x_i^{\alpha_i}$. Now the goal is to find coefficients $d_{\beta}$ such that $$\left(\sum_{\alpha\in \mathbb{N}_0^M} c_{\alpha} x^{\alpha}\right) \cdot \left(\sum_{\beta\in \mathbb{N}_0^M} d_{\beta} x^{\beta}\right) = \sum_{\gamma\in \mathbb{N}_0^M} x^{\gamma} \left(\sum_{\alpha + \beta = \gamma} c_{\alpha} d_{\beta}\right) = 1.$$
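To make this formal-series goal concrete, here is a small sketch in Python/sympy (assuming sympy is available; $M=2$ and the truncation degree $N=3$ are arbitrary illustrative choices) that computes the $d_\beta$ up to total degree $N$ and checks that the product with the truncated exponential series equals $1$ up to higher-order terms:

```python
import sympy as sp

M, N = 2, 3                      # illustrative choices: two variables, total degree <= 3
x1, x2, t = sp.symbols("x1 x2 t")

mean_exp = (sp.exp(x1) + sp.exp(x2)) / M

# d_beta up to total degree N: expand 1/mean_exp in a scaling parameter t
# (x_i -> t*x_i) and drop the O(t^{N+1}) tail.
recip = sp.series((1 / mean_exp).subs({x1: t * x1, x2: t * x2}), t, 0, N + 1).removeO()
recip_poly = sp.expand(recip.subs(t, 1))
print(recip_poly)

# Consistency check: the product with the truncated series of (1/M) sum_i e^{x_i}
# equals 1 up to terms of total degree > N.
exp_poly = sum(x ** k / sp.factorial(k) for x in (x1, x2) for k in range(N + 1)) / M
prod = sp.expand(recip_poly * exp_poly)
prod_trunc = sp.series(prod.subs({x1: t * x1, x2: t * x2}), t, 0, N + 1).removeO().subs(t, 1)
print(sp.simplify(sp.expand(prod_trunc) - 1))   # expected output: 0
```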
Solution 1:
Note that the natural generalization of the inequality above may not be $\frac{1}{\frac{1}{M} \sum_{i=1}^M e^{x_i}}$, but rather $\frac{1}{M} \sum_{i=1}^M e^{-x_i}$.
Considering the second case with $|x_i| \leq R/10$ gives $$\left|\frac{1}{M} \sum_{i=1}^M e^{-x_i} - \frac{1}{M} \sum_{i=1}^M \sum_{k=0}^R \frac{(-1)^k x_i^k}{k!} \right| \leq e^{-R}$$ by the triangle inequality.
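A quick random spot check of this averaged bound (again just an illustration; $R$, $M$ and the sample size are arbitrary choices):

```python
# Check that averaging preserves the e^{-R} bound (this is just the triangle inequality).
import math
import random

def T(x, R):
    """Degree-R Taylor polynomial of e^{-x} about 0."""
    return sum((-x) ** k / math.factorial(k) for k in range(R + 1))

random.seed(0)
R, M = 10, 5
worst = 0.0
for _ in range(2000):
    xs = [random.uniform(-R / 10, R / 10) for _ in range(M)]
    worst = max(worst, abs(sum(math.exp(-x) - T(x, R) for x in xs)) / M)
print(worst, math.exp(-R), worst <= math.exp(-R))
```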
Let $x_i$, $i=1,\dots,n$, be positive numbers with maximum term $x_m$. We claim the difference between their arithmetic mean and their harmonic mean is largest when one term goes to $0$ and the remaining terms equal $x_m$. Consider
$$ g(x_j) = \frac{x_1 + \cdots + x_n}{n} - \frac{n}{x_1^{-1} + \cdots + x_n^{-1}} \geq 0, $$ with equality iff $x_1 = \cdots = x_n$, by the AM-HM inequality. If $x_j$ is the smallest term, then
$$g'(x_j) = \frac{1}{n} - \frac{n}{\left(\frac{x_j}{x_1} + \cdots + \frac{x_j}{x_n}\right)^2} \leq 0$$ since $x_j/x_k \leq 1$ for every $k$ (strictly negative unless all terms are equal). In particular $g$ keeps increasing as $x_j$ decreases to $0$, so the supremum is approached with at least one $x_i$ equal to $0$, and therefore $$ g \leq \frac{n-1}{n} x_m. $$ Applying this to the numbers $e^{-x_i}$, whose harmonic mean is exactly $\frac{1}{\frac{1}{M}\sum_{i=1}^M e^{x_i}}$, and combining with the triangle inequality, for $|x_i| \leq R/10$ \begin{align} \left|\frac{1}{\frac{1}{M}\sum_{i=1}^M e^{x_i}} - \frac{1}{M} \sum_{i=1}^M \sum_{k=0}^R \frac{(-1)^k x_i^k}{k!}\right| \leq{} & \left|\frac{1}{\frac{1}{M}\sum_{i=1}^M e^{x_i}} - \frac{1}{M} \sum_{i=1}^M e^{-x_i}\right| \\ & + \left|\frac{1}{M} \sum_{i=1}^M e^{-x_i} - \frac{1}{M} \sum_{i=1}^M \sum_{k=0}^R \frac{(-1)^k x_i^k}{k!} \right| \\ \leq{} & \ e^{-R} + \left(1-\frac{1}{M}\right) \max_{1\leq i\leq M} e^{-x_i}. \end{align}
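Here is a small random spot check of this combined bound ($R$, $M$ and the sample size are arbitrary choices, and sampling of course proves nothing):

```python
# Spot check of
#   |1/((1/M) sum_i e^{x_i}) - (1/M) sum_i T_R(x_i)| <= e^{-R} + (1 - 1/M) max_i e^{-x_i},
# where T_R is the degree-R Taylor polynomial of e^{-x} and |x_i| <= R/10.
import math
import random

def T(x, R):
    """Degree-R Taylor polynomial of e^{-x} about 0."""
    return sum((-x) ** k / math.factorial(k) for k in range(R + 1))

random.seed(0)
R, M = 10, 8
ok = True
for _ in range(5000):
    xs = [random.uniform(-R / 10, R / 10) for _ in range(M)]
    lhs = abs(1 / (sum(math.exp(x) for x in xs) / M) - sum(T(x, R) for x in xs) / M)
    rhs = math.exp(-R) + (1 - 1 / M) * max(math.exp(-x) for x in xs)
    ok = ok and lhs <= rhs
print(ok)   # expected: True
```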
Comment: unfortunately, the difference between the HM and the AM destroys the accuracy of the bound. It doesn't seem easy to fix this, so we need a new approach.
Denote $\phi(x_1) = \frac{1}{M} \sum_{i=1}^M \sum_{k=0}^R \frac{x_i^k}{k!}$ (the other $x_i$ are treated as parameters). Replacing $x$ with $-x_i$ in the first inequality and averaging over $i$ gives $$\left|\frac{1}{M}\sum_{i=1}^M e^{x_i} - \phi(x_1)\right| \leq e^{-R},$$ and in particular $\phi(x_1) \geq e^{-R/10}-e^{-R} > 0$. Dividing this inequality by $\phi(x_1) \cdot \frac{1}{M}\sum_{i=1}^M e^{x_i}$ gives
\begin{align} \left|\frac{1}{\phi(x_1)} - \frac{1}{\frac{1}{M}\sum_{i=1}^M e^{x_i}}\right| &\leq \frac{e^{-R}}{\phi(x_1) \cdot \frac{1}{M} \sum_{i=1}^M e^{x_i}} \\ &\leq \frac{e^{-R}}{(e^{-R/10}-e^{-R})\,e^{-R/10}} \\ & = e^{-4R/5}\left(1-e^{-9R/10}\right)^{-1} \\ & \leq e^{-4R/5}\left(1+2e^{-9R/10}\right) \end{align} for $R \geq 1$, using $(1-y)^{-1} \leq 1+2y$ for $0 \leq y \leq \tfrac12$. If you can calculate $\phi(x_1)$, dividing at the end doesn't seem like a significant extra computation. If you still want $\frac{1}{\phi(x_1)}$ to be in the form of a truncated multivariable Taylor series, it's probably most straightforward to redo the problem by computing that Taylor series directly.
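A quick random spot check of this sharper bound, with the constant $1+2e^{-9R/10}$ as written above ($R$, $M$ and the sample size are again arbitrary choices):

```python
# Spot check of the bound obtained by dividing through by phi:
#   |1/phi - 1/((1/M) sum_i e^{x_i})| <= e^{-4R/5} (1 + 2 e^{-9R/10}),
# with phi = (1/M) sum_i sum_{k=0}^R x_i^k / k!,  |x_i| <= R/10,  R >= 1.
import math
import random

def S(x, R):
    """Degree-R Taylor polynomial of e^{x} about 0."""
    return sum(x ** k / math.factorial(k) for k in range(R + 1))

random.seed(0)
R, M = 10, 8
rhs = math.exp(-4 * R / 5) * (1 + 2 * math.exp(-9 * R / 10))
ok = True
for _ in range(5000):
    xs = [random.uniform(-R / 10, R / 10) for _ in range(M)]
    phi = sum(S(x, R) for x in xs) / M
    lhs = abs(1 / phi - 1 / (sum(math.exp(x) for x in xs) / M))
    ok = ok and lhs <= rhs
print(ok)   # expected: True
```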
Recall the multivariate Taylor expansion $f(\mathbf{x}) = e^{\mathbf{x} \cdot \nabla_{\mathbf{y}}} f(y_1,\dots,y_M) \Big|_{y_1 = \cdots = y_M = 0}$. For us $$ \frac{M}{\sum_{i=1}^M e^{x_i}} = \sum_{k=0}^N \sum_{k_1 + \cdots + k_M = k} \frac{1}{k_1!\cdots k_M!} \frac{\partial^k}{\partial y_1^{k_1} \cdots \partial y_M^{k_M}} \frac{M}{e^{y_1} + \cdots + e^{y_M}} \Bigg|_{y_1 = \cdots = y_M = 0} x_1^{k_1} \cdots x_M^{k_M} + R_N, $$
where, writing $g(t) = f(t\mathbf{x})$, $$R_N = \frac{g^{(N+1)}(t_0)}{(N+1)!} = \frac{\big[(\mathbf{x}\cdot\nabla)^{N+1} f\big](t_0 \mathbf{x})}{(N+1)!}, \qquad t_0 \in [0,1]. $$ The coefficients can be put in closed form, but they might not simplify nicely.
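As a concrete illustration of these multinomial coefficients, here is a sketch (assuming sympy, with the illustrative choices $M=2$, $N=3$) that builds the degree-$N$ Taylor polynomial of $M/(e^{y_1}+e^{y_2})$ from partial derivatives at $0$ and cross-checks it against a direct series expansion:

```python
import sympy as sp
from itertools import product

M, N = 2, 3                               # illustrative choices
y1, y2, x1, x2, t = sp.symbols("y1 y2 x1 x2 t")
f = M / (sp.exp(y1) + sp.exp(y2))

# Degree-N Taylor polynomial with coefficients (1/(k1! k2!)) * partial derivatives at 0.
poly = sp.Integer(0)
for k1, k2 in product(range(N + 1), repeat=2):
    if k1 + k2 > N:
        continue
    deriv = f
    if k1:
        deriv = sp.diff(deriv, y1, k1)
    if k2:
        deriv = sp.diff(deriv, y2, k2)
    coeff = deriv.subs({y1: 0, y2: 0}) / (sp.factorial(k1) * sp.factorial(k2))
    poly += coeff * x1 ** k1 * x2 ** k2

# Cross-check against a direct expansion in a scaling parameter t (y_i -> t*x_i).
direct = sp.series(f.subs({y1: t * x1, y2: t * x2}), t, 0, N + 1).removeO().subs(t, 1)
print(sp.simplify(sp.expand(poly - direct)))   # expected output: 0
```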
One way to invert $\phi(x_1)$ is to use Bürmann's theorem, but the remainder won't be fun and might involve the other $x_i$. "Cayley's formula" says $$ \frac{1}{\phi(x_1)} = \frac{1}{\phi(0)} + \sum_{k=1}^\infty \frac{(-x_1)^k}{\phi(0)^{k+1}(k+1)!} (x \phi(x_1)^k)_{(k+1)x_1}, $$ but this doesn't help either.