Is there any known relationship between $\sum_{i=1}^{n} f(i)$ and $\sum_{i=1}^{n} \dfrac {1}{f(i)}$

$\newcommand{\angles}[1]{\left\langle\,{#1}\,\right\rangle} \newcommand{\braces}[1]{\left\lbrace\,{#1}\,\right\rbrace} \newcommand{\bracks}[1]{\left\lbrack\,{#1}\,\right\rbrack} \newcommand{\dd}{\mathrm{d}} \newcommand{\ds}[1]{\displaystyle{#1}} \newcommand{\expo}[1]{\,\mathrm{e}^{#1}\,} \newcommand{\half}{{1 \over 2}} \newcommand{\ic}{\mathrm{i}} \newcommand{\iff}{\Longleftrightarrow} \newcommand{\imp}{\Longrightarrow} \newcommand{\Li}[1]{\,\mathrm{Li}_{#1}} \newcommand{\ol}[1]{\overline{#1}} \newcommand{\pars}[1]{\left(\,{#1}\,\right)} \newcommand{\partiald}[3][]{\frac{\partial^{#1} #2}{\partial #3^{#1}}} \newcommand{\ul}[1]{\underline{#1}} \newcommand{\root}[2][]{\,\sqrt[#1]{\,{#2}\,}\,} \newcommand{\totald}[3][]{\frac{\mathrm{d}^{#1} #2}{\mathrm{d} #3^{#1}}} \newcommand{\verts}[1]{\left\vert\,{#1}\,\right\vert}$ If $\mathrm{f}\pars{i} > 0$ for every $i$, then $\mathrm{f}\pars{i} + 1/\mathrm{f}\pars{i} \geq 2$ by AM-GM, so \begin{align} \color{#f00}{\sum_{i = 1}^{n}\mathrm{f}\pars{i} + \sum_{i = 1}^{n}{1 \over \mathrm{f}\pars{i}}} & \geq \color{#f00}{2n} \end{align}
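This bound is easy to spot-check numerically. The sketch below (the helper name `bound_holds` is just illustrative, and it assumes all values are positive) tests the termwise inequality $a + 1/a \geq 2$ on random positive sequences:

```python
import random

def bound_holds(vals):
    """Check sum(a_i) + sum(1/a_i) >= 2n for positive a_i.
    True termwise, since a + 1/a - 2 = (a - 1)^2 / a >= 0 for a > 0."""
    n = len(vals)
    return sum(vals) + sum(1.0 / a for a in vals) >= 2 * n

# Spot-check against random positive sequences of varying length.
random.seed(1)
trials = [[random.uniform(1e-3, 1e3) for _ in range(random.randint(1, 30))]
          for _ in range(500)]
assert all(bound_holds(v) for v in trials)
```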


When $n=\infty$, the reciprocal sum will either blow up or oscillate forever by the arguments in the comments. For finite $n$, there is rarely much you can do.

Here's a well-known example where you do know the reciprocal sum:

$$\sum_{d|n}d=n\sum_{d|n}\frac{1}{d},$$
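This identity holds because $d \mapsto n/d$ permutes the divisors of $n$. A quick exact-arithmetic check (helper names are just illustrative):

```python
from fractions import Fraction

def divisors(n):
    """All positive divisors of n, by trial division."""
    return [d for d in range(1, n + 1) if n % d == 0]

def check_divisor_identity(n):
    """sum_{d|n} d == n * sum_{d|n} 1/d, since d -> n/d permutes the divisors."""
    left = sum(divisors(n))
    right = n * sum(Fraction(1, d) for d in divisors(n))
    return left == right

assert all(check_divisor_identity(n) for n in range(1, 200))
```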

where the sum is over the divisors of $n$. Another example is $f(i)=p^i$ (with $p\neq 1$), in which case you get:

$$\sum_{i=1}^nf(i)=\frac{p-p^{n+1}}{1-p}=:S$$ $$\sum_{i=1}^n\frac{1}{f(i)}=\frac{S}{p^{n+1}}$$
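Both the closed form $S = \frac{p - p^{n+1}}{1-p}$ and the relation $\sum_{i=1}^n p^{-i} = S/p^{n+1}$ can be verified in exact rational arithmetic (the function name here is just illustrative):

```python
from fractions import Fraction

def geometric_reciprocal_relation(p, n):
    """For f(i) = p^i with p != 1: check S = (p - p^{n+1})/(1 - p)
    and sum_{i=1}^n p^{-i} = S / p^{n+1}, exactly."""
    p = Fraction(p)
    S = sum(p ** i for i in range(1, n + 1))
    closed_form = (p - p ** (n + 1)) / (1 - p)
    reciprocal = sum(1 / p ** i for i in range(1, n + 1))
    return S == closed_form and reciprocal == S / p ** (n + 1)

assert all(geometric_reciprocal_relation(p, n)
           for p in (2, 3, Fraction(5, 2)) for n in range(1, 8))
```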

For a more classical example, take $f(i)=i^r$. Then the sum of $f(i)$ is a Faulhaber polynomial, whereas the reciprocal sum is a generalized harmonic number; the two aren't related to each other in any obvious way. In general, you have:

$$\sum_{i=1}^n \frac{1}{f(i)}=\frac{\sum_{i=1}^n\prod_{j\neq i}f(j)}{\prod_{i=1}^nf(i)}.$$

The numerator is quite annoying here, and generally intractable.
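For what it's worth, the common-denominator formula itself is easy to confirm on small inputs (the helper name is just illustrative):

```python
from fractions import Fraction
from math import prod

def reciprocal_sum_via_products(vals):
    """Compute sum_i 1/a_i as (sum_i prod_{j != i} a_j) / (prod_i a_i)."""
    numerator = sum(prod(v for j, v in enumerate(vals) if j != i)
                    for i in range(len(vals)))
    denominator = prod(vals)
    return Fraction(numerator, denominator)

vals = [3, 4, 5, 7]
assert reciprocal_sum_via_products(vals) == sum(Fraction(1, v) for v in vals)
```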


Let $a_i = f(i)$, for $i = 1, 2, \ldots, n$. Then you are asking for a relationship between $$ e_1 = p_1 = \sum_{i=1}^n a_i $$ and $$ p_{-1} = \sum_{i=1}^n \frac{1}{a_i}. $$ Where did this terminology come from? $e_i$ is the $i$th elementary symmetric polynomial in variables $a_1, \ldots, a_n$, and is defined for $i = 0$ to $i = n$. We have \begin{align*} e_0 &= 1 \\ e_1 &= a_1 + a_2 + \ldots + a_n \\ e_2 &= a_1a_2 + a_1a_3 + \ldots + a_{n-1}a_n \\ e_3 &= a_1a_2a_3 + a_1a_2a_4 + \ldots + a_{n-2}a_{n-1}a_n \\ &\cdots \\ e_n &= a_1a_2a_3\ldots a_n \end{align*} And $p_i$ (for any integer $i$) is the sum of the $i$th powers of the $a_i$: \begin{align*} &\cdots \\ p_{-1} &= a_1^{-1} + a_2^{-1} + \cdots + a_n^{-1} = \frac{1}{a_1} + \frac{1}{a_2} + \cdots + \frac{1}{a_n} \\ p_0 &= 1 + 1 + \cdots + 1 = n \\ p_1 &= a_1 + a_2 + \cdots + a_n \\ p_2 &= a_1^2 + a_2^2 + \cdots + a_n^2 \\ &\cdots \end{align*}

So, is there any known relationship?

If you're looking for an answer involving only these two quantities and simple expressions, the answer is no. There cannot be any such relationship, because the two expressions $p_1$ and $p_{-1}$ are independent functions of the variables. By this I mean you can find two sequences $a_1, a_2, \ldots, a_n$ and $a_1', a_2', \ldots, a_n'$ that give the same value for $p_1$ but different values for $p_{-1}$ (for instance, $(1, 3)$ and $(2, 2)$ both have $p_1 = 4$, yet $p_{-1} = \frac{4}{3}$ and $p_{-1} = 1$ respectively), and vice versa, so there is no simple formula for one in terms of the other, nor any general dependency.

However, if you're willing to allow other quantities into the mix and just want to know how the two quantities are related, symmetric polynomials are where to look. There is an entire class of identities known as Newton's identities relating the $p_i$ to each other and to the $e_i$. (Unfortunately, the Wikipedia page only defines $p_k$ for $k \ge 1$, but there's no need to be that restrictive.) So you're looking for a relationship between $e_1$ and $p_{-1}$. What such relationships exist? Well, for example, we have $$ p_{-1} = \frac{e_{n-1}}{e_n} $$ at which point an equation relating $p_{-1}$ and $e_1$ is the same as an equation relating $e_1, e_{n-1},$ and $e_n$, such as $$ p_0 e_n - e_{n-1} p_1 + e_{n-2} p_2 - e_{n-3} p_3 + \cdots + (-1)^n p_n e_0 = 0. $$ In general, the relationship between the symmetric polynomials $p_i$ and $e_i$ is such that once you fix $n$ of them, you can compute all the rest in terms of those $n$ (often recursively), but until you have fixed the $n$ you are interested in, you won't get a full relationship between any two. So you want a relationship between $e_1$ and $p_{-1}$, but you can only express one that depends on some $n$ symmetric-polynomial inputs (and you haven't said which $n$).
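Both the formula $p_{-1} = e_{n-1}/e_n$ and the alternating identity above can be checked directly from the definitions (the short helper names `e` and `p` are just illustrative):

```python
from fractions import Fraction
from itertools import combinations
from math import prod

def e(vals, k):
    """Elementary symmetric polynomial e_k: sum over k-element subsets."""
    return sum(prod(c) for c in combinations(vals, k))

def p(vals, k):
    """Power sum p_k = sum of k-th powers (negative k works via Fraction)."""
    return sum(Fraction(x) ** k for x in vals)

a = [2, 3, 5, 7]
n = len(a)

# p_{-1} = e_{n-1} / e_n
assert p(a, -1) == Fraction(e(a, n - 1), e(a, n))

# p_0 e_n - e_{n-1} p_1 + ... + (-1)^n p_n e_0 = 0
assert sum((-1) ** j * e(a, n - j) * p(a, j) for j in range(n + 1)) == 0
```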


If you are lucky with the expression of $f$, you can play around with the Möbius inversion formula.

Let $f, g : \mathbb{N} \to G$ be two functions from $\mathbb{N}$ to an additive abelian group $G$.

Then

\begin{equation} f(n) = \sum_{d|n} g(d) \Leftrightarrow g(n) = \sum_{d|n} \mu(\frac{n}{d}) f(d) = \sum_{d|n} \mu(d) f(\frac{n}{d}) \end{equation}

where $\mu$ is the Möbius function, defined as

  • $\mu(n) = 1$ if $n=1$
  • $\mu(n) = (-1)^k$ if $n$ is a square-free positive integer with $k$ distinct prime factors.
  • $\mu(n) = 0$ if $n$ has a squared prime factor.

Then some results could follow from letting $g = \frac{1}{f}$.
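The inversion itself is easy to test: pick any $g$, build $f(n) = \sum_{d|n} g(d)$, and recover $g$ from $f$. A minimal sketch (the helper names and the test function $g(n) = n^2 + 1$ are arbitrary choices, not from the original):

```python
def mobius(n):
    """Möbius function mu(n), by trial factorization."""
    if n == 1:
        return 1
    result = 1
    d = 2
    while d * d <= n:
        if n % d == 0:
            n //= d
            if n % d == 0:      # squared prime factor => mu(n) = 0
                return 0
            result = -result    # one more distinct prime factor
        d += 1
    if n > 1:                   # leftover prime factor
        result = -result
    return result

def divisors(n):
    return [d for d in range(1, n + 1) if n % d == 0]

g = lambda n: n * n + 1                      # arbitrary test function
f = lambda n: sum(g(d) for d in divisors(n))  # f(n) = sum_{d|n} g(d)

# Recover g via g(n) = sum_{d|n} mu(n/d) f(d).
for n in range(1, 60):
    assert sum(mobius(n // d) * f(d) for d in divisors(n)) == g(n)
```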