Find the covariances of a multinomial distribution
We can use indicator random variables to simplify the covariance expression. Interpret the problem as $r$ independent rolls of an $n$-sided die. Let $X_i$ be the number of rolls that result in side $i$ facing up, and let $I_{k}^{(i)}$ be an indicator equal to $1$ when roll $k$ shows side $i$ and $0$ otherwise. Then we can express $X_i$ and $X_j$ as follows:
$$\begin{equation} X_i = \sum_{k=1}^{r} I_{k}^{(i)}~~~\mathrm{and}~~~X_j = \sum_{k=1}^{r} I_{k}^{(j)} \end{equation}$$
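To make the indicator decomposition concrete, here is a minimal simulation sketch (not part of the derivation), assuming NumPy and, for illustration, a fair six-sided die with $r = 1000$ rolls; the variable names are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)

r, n = 1000, 6                      # number of rolls, number of sides (assumed values)
p = np.full(n, 1.0 / n)             # fair die assumed: each side has probability 1/n

rolls = rng.choice(n, size=r, p=p)  # r independent rolls, sides labeled 0..n-1

# Indicator I_k^(i): 1 if roll k shows side i, else 0
i, j = 0, 1
I_i = (rolls == i).astype(int)
I_j = (rolls == j).astype(int)

# X_i is the sum of its indicators over all r rolls
X_i = I_i.sum()
X_j = I_j.sum()
print(X_i, X_j)                     # counts of sides i and j among the r rolls
```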
Let's rewrite the covariance using indicators:
$$\begin{equation} \mathrm{Cov}(X_i,X_j) = E[X_i X_j] - E[X_i]E[X_j] \end{equation}$$

First, let's compute $E[X_i X_j]$:
$$\begin{eqnarray} E[X_i X_j] &=& E\bigg[\Big(\sum_{k=1}^{r}I_{k}^{(i)}\Big) \Big(\sum_{l=1}^{r}I_{l}^{(j)}\Big)\bigg] = \sum_{k=l}E\big[I_{k}^{(i)}I_{l}^{(j)}\big] + \sum_{k\neq l}E\big[I_{k}^{(i)}I_{l}^{(j)}\big] \\ &=& 0 + \sum_{k\neq l}E\big[I_{k}^{(i)}\big] E\big[I_{l}^{(j)}\big] = \sum_{k\neq l} p_i p_j = (r^2 - r)p_i p_j \end{eqnarray}$$
Here we expanded the product of sums and used linearity of expectation. When $k=l$, a single roll cannot show both side $i$ and side $j$, so the product of indicators is zero. When $k\neq l$, the rolls are independent, so the expectation of the product factors into $p_i p_j$, and there are $r^2 - r$ ordered pairs with $k \neq l$.

Next, let's compute the remaining term:
$$\begin{equation} E[X_i] = E\Big[\sum_{k=1}^{r}I_{k}^{(i)}\Big] = \sum_{k=1}^{r}E\big[I_{k}^{(i)}\big] = rp_i \end{equation}$$

Therefore, the covariance equals:
$$\begin{equation} \mathrm{Cov}(X_i,X_j) = E[X_i X_j] - E[X_i]E[X_j] = (r^2-r)p_ip_j - r^2p_ip_j = -r p_i p_j \end{equation}$$

Notice that $\mathrm{Cov}(X_i, X_j) = -r p_i p_j < 0$. This makes sense intuitively: for a fixed number of rolls $r$, the more rolls that land on side $i$, the fewer rolls remain that can land on side $j$, so $X_i$ and $X_j$ are negatively correlated!
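As a quick numerical sanity check (an illustration, not part of the original derivation), the sketch below, assuming NumPy, a fair six-sided die, and $r = 50$ rolls per experiment, estimates the sample covariance of two counts from repeated multinomial draws and compares it with $-r p_i p_j$:

```python
import numpy as np

rng = np.random.default_rng(1)

r, n = 50, 6                          # rolls per experiment, number of sides (assumed values)
p = np.full(n, 1.0 / n)               # fair die assumed
trials = 200_000                      # number of repeated experiments

# Each row is one multinomial draw: the counts (X_1, ..., X_n) from r rolls
counts = rng.multinomial(r, p, size=trials)

i, j = 0, 1
sample_cov = np.cov(counts[:, i], counts[:, j])[0, 1]  # off-diagonal of 2x2 covariance matrix
theory_cov = -r * p[i] * p[j]                          # the derived formula -r * p_i * p_j

print(f"sample covariance      = {sample_cov:.4f}")
print(f"theoretical -r*p_i*p_j = {theory_cov:.4f}")
```

With these assumed parameters the theoretical value is $-50/36 \approx -1.39$, and the sample covariance should land close to it for a large number of trials.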