We roll a six-sided die ten times. What is the probability that the total of all ten rolls is divisible by 6?

Hint.

Roll $9$ times and let $x$ be the total.

For exactly one number $n\in\{1,2,3,4,5,6\}$ we will have $6 \mid (x+n)$ (i.e. $x+n$ is divisible by $6$).
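A two-line check of this claim in Python (a quick sketch; the range covers every possible total of nine d6 rolls):

```python
# For every possible total x of nine d6 rolls (9 through 54), check that
# exactly one face n in {1,...,6} makes x + n divisible by 6.
for x in range(9, 55):
    assert len([n for n in range(1, 7) if (x + n) % 6 == 0]) == 1
print("every total has exactly one completing face")
```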


After rolling the die once, every result modulo 6 is equally likely. Adding any independent integer to it preserves this equidistribution. So you can even roll a 20-sided die afterwards and add its outcome: the total will still be divisible by 6 with probability 1/6.
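To see this concretely, here is a short exact computation in Python (a sketch; `add_die` is an illustrative helper): it tracks the distribution of the running total modulo 6 and confirms that once a fair d6 has been rolled, the residue stays uniform even after adding a d20.

```python
from fractions import Fraction

def add_die(dist, sides):
    """Convolve the residue-mod-6 distribution with one fair die."""
    new = [Fraction(0)] * 6
    for r, p in enumerate(dist):
        for face in range(1, sides + 1):
            new[(r + face) % 6] += p * Fraction(1, sides)
    return new

dist = [Fraction(1)] + [Fraction(0)] * 5   # total is 0 before any roll
for _ in range(10):
    dist = add_die(dist, 6)                # ten d6 rolls
print(dist)                                # all six entries are 1/6
dist = add_die(dist, 20)                   # add a d20 on top
print(dist)                                # still all 1/6
```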


If you want something a little more formal and solid than drhab's clever and brilliant answer:

Let $P(k,n)$ be the probability that the total of $n$ dice has remainder $k$ when divided by $6$, for $k = 0,\dots,5$.

$P(k, 1)$ = probability of rolling a $k$ if $k \ne 0$, or a $6$ if $k = 0$; in either case $P(k, 1) = \frac 1 6$.

For $n > 1$, condition on the remainder $j$ after the first $n-1$ dice; exactly one face of the $n$th die lies in the residue class $k - j \pmod 6$, so $P(k,n) = \sum_{j=0}^5 P(j, n-1)\cdot\frac 1 6 = \frac 1 6\sum_{j=0}^5 P(j, n-1) = \frac 1 6 \cdot 1 = \frac 1 6$.

This is drhab's answer, but in formal terms and without appeals to common sense.
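The recursion translates directly into code; here is a minimal Python sketch (the memoized function `P` mirrors the $P(k,n)$ defined above, everything else is illustrative):

```python
from fractions import Fraction
from functools import lru_cache

@lru_cache(maxsize=None)
def P(k, n):
    """Probability that the total of n fair dice is congruent to k mod 6."""
    if n == 1:
        return Fraction(1, 6)  # exactly one face falls in each residue class
    # Condition on the residue j after n - 1 dice; the nth die then lands in
    # the residue class k - j (mod 6) with probability 1/6, whatever j is.
    return sum(P(j, n - 1) * Fraction(1, 6) for j in range(6))

print(P(0, 10))  # Fraction(1, 6)
```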


In spite of all the great answers given here, why not give another proof, from another point of view? The problem is that we have 10 random variables $X_i$, $i=1,\dots,10$, defined over $[6]=\{1,\dots,6\}$, and we are interested in the distribution of $Z$ defined as $$ Z=X_1\oplus X_2\oplus \dots \oplus X_{10} $$ where $\oplus$ is addition modulo $6$. We can proceed by two different, yet similar, proofs.


First proof: If $X_1$ and $X_2$ are two independent random variables over $[6]$ and $X_1$ is uniformly distributed, a direct calculation shows that $X_1\oplus X_2$ is also uniformly distributed. The same logic, applied repeatedly, yields that $Z$ is uniformly distributed over $[6]$.
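For the skeptical, the "direct calculation" is a six-by-six convolution; a Python sketch with an arbitrarily biased $X_2$ (the weights below are made up) shows the result is uniform:

```python
from fractions import Fraction

# X1 is uniform on {1,...,6}; X2 is deliberately biased (weights made up).
x2 = {1: Fraction(1, 2), 2: Fraction(1, 4), 6: Fraction(1, 4)}
z = [Fraction(0)] * 6
for a in range(1, 7):            # X1, each face with probability 1/6
    for b, p in x2.items():      # X2, arbitrary independent distribution
        z[(a + b) % 6] += Fraction(1, 6) * p
print(z)                         # all six entries are Fraction(1, 6)
```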

Remark: This proves a more general statement. Even if only one of the dice is fair, i.e. each face appears with probability $\frac 16$, the distribution of $Z$ will be uniform and hence $\mathbb P(Z=0)=\frac 16$.


Second proof: This proof draws on (simple) information-theoretic tools and assumes some background in them. The random variable $Z$ is the output of an additive noisy channel, and it is known that the worst case is uniformly distributed noise. In other words, if $X_i$ is uniform for even one $i$, then $Z$ will be uniform. To see this, suppose that $X_1$ is uniformly distributed. Consider the mutual information $I(X_2,X_3,\dots,X_{10};Z)$, which can be written as $H(Z)-H(Z|X_2,\dots,X_{10})$. But we have: $$ H(Z|X_2,\dots,X_{10})=H(X_1|X_2,\dots,X_{10})=H(X_1) $$
where the first equality follows from the fact that, knowing $X_2,\dots,X_{10}$, the only uncertainty in $Z$ is due to $X_1$. The second equality holds because $X_1$ is independent of the others. Now observe:

  • Mutual information is nonnegative, so $H(Z)\geq H(Z|X_2,\dots,X_{10})=H(X_1)$.
  • The entropy of $Z$ is always less than or equal to that of a uniformly distributed random variable over $[6]$, and $X_1$ is uniform: $H(Z)\leq \log 6 = H(X_1)$.
  • From the last two, $H(Z)=H(X_1)=\log 6$, so $Z$ is uniformly distributed and the proof is complete.

Similarly, here only one fair die is enough. Moreover, the same proof works for an arbitrary set $[n]$: as long as one of the $X_i$'s is uniform, their finite sum modulo $n$ will be uniformly distributed.
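A numerical illustration of the entropy argument in Python (a sketch; the biased distribution is made up, and residues are written $0,\dots,5$ rather than $\{1,\dots,6\}$): mixing one fair die with nine biased ones drives $H(Z)$ up to $\log_2 6$ and $Z$ to uniform, exactly as the proof predicts.

```python
import math

def convolve(p, q):
    """Distribution of (A + B) mod 6 for independent residue distributions."""
    return [sum(p[j] * q[(k - j) % 6] for j in range(6)) for k in range(6)]

def entropy(p):
    return -sum(x * math.log2(x) for x in p if x > 0)

fair = [1 / 6] * 6                          # the one fair die
biased = [0.5, 0.3, 0.1, 0.05, 0.05, 0.0]   # made-up noise distribution

z = fair
for _ in range(9):                          # nine biased dice on top
    z = convolve(z, biased)
print(entropy(biased))                      # about 1.79, below log2(6)
print(entropy(z), math.log2(6))             # both 2.5849...
print(z)                                    # uniform up to float error
```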