Why is $3$ the multiplicative coefficient in the Collatz conjecture?
As explained in Isaac Solomon's answer, just adding $1$ would make the problem simple. Multiplying by $2$ (or any other even number) and adding $1$ would not work either: for odd $n$, the result $2n+1$ is again odd, so from that point on the sequence would only keep growing. So one arrives at $3$.
One could ask the same question for $5$ instead of $3$, or for other variants, but note the following:
If you assume a probabilistic model, then in the multiply-by-$3$ case you are in the following situation: in "half" the cases (even $n$) you divide by $2$, and in "half" the cases (odd $n$) you take $3n+1$, but then you are guaranteed to divide by $2$, so effectively the situation is:
- "half" the time multiply by $\frac{1}{2}$
- "half" the time multiply by (roughly) $\frac{3}{2}$ (as the combination of $3n+1$ and the then forced $\frac{(3n+1)}{2}$).
The product of these two factors, $\frac{1}{2}$ and $\frac{3}{2}$, is $\frac{3}{4}$, which is less than $1$, so the heuristic suggests a decrease in the long run. (This argument is rough, but one could make it a bit more precise; to get a first idea it should do.)
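Just as an illustration of this heuristic (not part of the argument itself), here is a small Python sketch that estimates the average change of $\log n$ per step along actual $3n+1$ trajectories; the starting range and the number of trials are arbitrary choices:

```python
import math
import random

def log_drift(trials=2000, steps=300):
    """Estimate the average change of log(n) per Collatz step,
    starting from random odd numbers."""
    total, count = 0.0, 0
    for _ in range(trials):
        n = random.randrange(3, 10**6, 2)  # random odd starting value
        for _ in range(steps):
            prev = n
            n = n // 2 if n % 2 == 0 else 3 * n + 1
            total += math.log(n) - math.log(prev)
            count += 1
            if n == 1:
                break
    return total / count

print(log_drift())  # typically negative, consistent with the 3/4 < 1 heuristic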
If you do the same with $5$ (or something still larger) instead of $3$, you'd have $\frac{1}{2}$ and $\frac{5}{2}$ instead, so you'd get $\frac{5}{4}$, which is greater than $1$. So in the long run you'd expect an increase.
Indeed, one can also study this problem with $5$ instead of $3$, but then the situation is different: it is believed that some starting values give sequences that escape to infinity, while others fall into one of several small loops. So the question remains interesting, but its character changes, since one now has values that escape to infinity.
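To illustrate the small loops, one can iterate the $5n+1$ map and record any cycle encountered (only a sketch; the search bound and the cap below are arbitrary, and with $5n+1$ it is merely believed, not proven, that some trajectories escape):

```python
def step5(n):
    """One step of the 5n+1 analogue: halve if even, else apply 5n+1."""
    return n // 2 if n % 2 == 0 else 5 * n + 1

def find_cycle(start, max_steps=10**4, cap=10**12):
    """Follow the 5n+1 trajectory from `start`; return the cycle if one is
    entered before exceeding `cap`, otherwise None."""
    seen, path, n = {}, [], start
    for i in range(max_steps):
        if n in seen:
            return path[seen[n]:]   # the repeating part of the trajectory
        if n > cap:
            return None             # apparently unbounded
        seen[n] = i
        path.append(n)
        n = step5(n)
    return None

for s in range(1, 20):
    cycle = find_cycle(s)
    print(s, "->", cycle if cycle else "exceeds the cap")
```

Starting values below $20$ already exhibit a few distinct loops (for instance $1\to6\to3\to16\to8\to4\to2\to1$), while other starting values appear to grow without bound and exceed the cap.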
More generally, numerous Collatz-like problems have been studied, and for some of them it can even be shown that they are formally undecidable.
Here is a possible explanation:
$3$ is the unique factor which, presumably, allows the effects of the $3x+1$ steps and the effects of the $x/2$ steps to compensate statistically. As such, the Collatz algorithm is at the boundary between escaping to infinity and hitting $1$ after a (long) while. Hence the excitement and, perhaps, the difficulty...
To explain the heuristics, consider that each $3x+1$ step produces an even integer and is followed by one or several $x/2$ steps. For each $k\geqslant1$, the proportion of even integers requiring $k$ of these $x/2$ steps is $1/2^k$, hence the global effect of these $x/2$ steps on an even integer chosen at random is to multiply it by a factor whose mean is
$$
\frac12\cdot\frac12+\frac14\cdot\frac14+\cdots+\frac1{2^k}\cdot\frac1{2^k}+\cdots=\frac13.
$$
This counterbalances exactly, asymptotically, the effect of the unique $3x+1$ step before a new succession of $x/2$ steps occurs.
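As a quick numerical sanity check of that $\frac13$ mean (just an illustrative sketch; the bound $10^6$ is arbitrary):

```python
def halvings(n):
    """Number of times the even integer n can be divided by 2."""
    k = 0
    while n % 2 == 0:
        n //= 2
        k += 1
    return k

# Average of the factor 1/2^k over the even integers up to N.
N = 10**6
mean_factor = sum(0.5 ** halvings(n) for n in range(2, N + 1, 2)) / (N // 2)
print(mean_factor)  # close to 1/3
```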
One should mention that making this argument rigorous is a daunting task. For example, it is far from obvious that the integer produced by the $3x+1$ step is uniformly distributed, asymptotically. Thus, the heuristics above are almost certainly false, and the challenge is to show that they are not false enough to invalidate the conjecture that, for every starting point, the Collatz algorithm hits the cycle $1\to2\to4\to1$. Note finally that the analogy with a zero-drift random walk suggests that the hitting time of this cycle should be finite, yes, but typically large, since null-recurrent random walks hit each given vertex after an almost surely finite, but not integrable, random time.
If you use $n \to n+1$ (for odd $n$) instead, the process will always go to $1$. This is because, if you start with a number $n$, there are three possibilities as to what might happen over the next two turns. If $n$ is odd, then
$$ n \to \frac{n+1}{2}$$
If $n$ is even, you either have
$$ n \to \frac{n}{2} + 1$$
or
$$ n \to \frac{n}{4}$$
You can check for yourself that if $n>2$, this always shrinks the value of $n$. Over time, we must arrive at $1$.
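For what it's worth, here is a tiny Python check of this shrinking behaviour (an illustrative sketch; the bound $10^5$ is arbitrary):

```python
def plus_one_steps(n):
    """Steps of the map (n -> n+1 if n is odd, n -> n/2 if n is even) until hitting 1."""
    steps = 0
    while n != 1:
        n = n + 1 if n % 2 else n // 2
        steps += 1
    return steps

# Every starting value below the bound reaches 1, and quickly: the worst case
# stays within a few dozen steps, consistent with the two-turn argument above.
print(max(plus_one_steps(n) for n in range(2, 10**5)))
```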
Heuristically, the growing force ($+1$) and the shrinking force ($/2$) are not balanced: addition of $1$ is not as powerful as division by $2$.
The importance of choosing $n \to 3n + 1$ is that division by $2$ is balanced by multiplying by $3$ and adding $1$, since on occasion we divide by $2$ more than once in a row. See Did's answer for an explanation of how the balancing occurs.
Given a large set of numbers, half should have $\frac{3x+1}2$ applied and half $\frac x2$. Both $\frac{3x+1}2$ and $\frac x2$ are equally likely to produce even and odd numbers. So half the numbers produced by $\frac{3x+1}2$ should then have $\frac{3x+1}2$ applied again and half $\frac x2$; likewise, half the numbers produced by $\frac x2$ should then have $\frac{3x+1}2$ applied and half $\frac x2$. And so forth.
Consider the effect after $n$ iterations. $\frac1{2^n}\binom{n}{k}$ of the initial numbers will have $\frac{3x+1}2$ applied $n-k$ times and $\frac x2$ applied $k$ times. Thus, given an initial average value $m$, the expected ending value would be $$ \frac1{2^n}\sum_{k=0}^n\binom{n}{k}\left(\frac32\right)^{n-k}\left(\frac12\right)^km=m $$ Thus, the $\frac{3x+1}2$ and $\frac x2$ maps achieve a constancy in the expected value of their results.
Similarly, using the $\frac{x+1}2$ and $\frac x2$ maps, we would get an expected value of $$ \frac1{2^n}\sum_{k=0}^n\binom{n}{k}\left(\frac12\right)^{n-k}\left(\frac12\right)^km=\frac m{2^n} $$ This would give a precipitous decline in the expected value.
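Both sums are just instances of the binomial theorem, which a short numerical check confirms (only a sketch; the step count $20$ is an arbitrary choice):

```python
from math import comb

def expected_factor(a, n):
    """Average multiplier after n steps when each step multiplies by a or by 1/2
    with equal probability (the +1 terms are ignored, as in the heuristic)."""
    return sum(comb(n, k) * a ** (n - k) * 0.5 ** k for k in range(n + 1)) / 2 ** n

print(expected_factor(1.5, 20))  # the (3x+1)/2 and x/2 maps: stays at 1.0
print(expected_factor(0.5, 20))  # the (x+1)/2 and x/2 maps: 1/2^20, a precipitous decline
```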