The sum of powers of two and two's complement – is there a deeper meaning behind this?

Probably everyone has at some point come across the following "theorem" with its corresponding "proof": $$\sum_{n=0}^\infty 2^n = -1$$ Proof: $\sum_{n=0}^\infty q^n = 1/(1-q)$. Insert $q=2$ to get the result.

Of course the "proof" neglects the condition $|q| < 1$ under which the geometric series formula holds; for $q=2$ the partial sums $2^{n+1} - 1$ grow without bound, so the series really diverges in the reals. However, I recently noticed an interesting fact:

If you use two's complement to represent negative numbers on a computer, $-1$ is represented by all bits set. Moreover, sign-extending to a larger number of bits (that is, obtaining the same number in two's complement representation on more bits) works by copying the left-most bit (also known as the sign bit) into the additional bits on the left.
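A minimal Python sketch of both facts (the bit widths are just illustrative):

```python
# Two's complement of -1 on a few bit widths: it is always the
# all-ones pattern, and sign extension to a wider type simply
# prepends more copies of the sign bit (here: more 1s).
for width in (8, 16, 32):
    bits = format(-1 & ((1 << width) - 1), f"0{width}b")
    print(f"{width:2d} bits: {bits}")
```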

Now imagine that you formally sign-extend the number $-1$ to infinitely many bits. What you get is a string of $1$s extending infinitely to the left. Using the usual base-2 formula $n = \sum_k b_k 2^k$ (where $b_k$ is the bit $k$ positions from the right, so $b_0$ is the rightmost bit), that infinite string of $1$s translates into exactly the sum above! So in some sense we have an independent re-derivation of that equation.
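One can check the finite version of this: the partial sum $\sum_{k=0}^{n-1} 2^k = 2^n - 1$ is exactly the $n$-bit all-ones pattern, i.e. the $n$-bit two's complement encoding of $-1$. A small sketch (the helper `to_signed` is my own, not a standard function):

```python
def to_signed(x, width):
    """Interpret the low `width` bits of x as a two's complement integer."""
    x &= (1 << width) - 1
    return x - (1 << width) if x >> (width - 1) else x

for n in (4, 8, 16, 64):
    partial = sum(2 ** k for k in range(n))   # 1 + 2 + ... + 2^(n-1) = 2^n - 1
    print(n, partial, to_signed(partial, n))  # the n-bit reading is always -1
```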

Now my question is: Is there something deeper behind this? Somehow I cannot imagine it is just coincidental.


Yes. What you are doing is known as working in the $2$-adic numbers.

The $2$-adic numbers are equipped with a curious notion of distance given by the $2$-adic metric. In this metric, two numbers are close together if their difference is divisible by a large power of $2$; in particular, large powers of $2$ are very small. So relative to the $2$-adic metric the geometric series you wrote down really does converge, and the value it converges to really is $-1$: the partial sum $\sum_{n=0}^{N} 2^n = 2^{N+1} - 1$ differs from $-1$ by exactly $2^{N+1}$, which is $2$-adically tiny for large $N$.
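A small numerical sketch of this convergence, with a hand-rolled helper `two_adic_abs` (my own name, using the standard normalization $|x|_2 = 2^{-v}$ where $2^v$ is the largest power of $2$ dividing $x$):

```python
def two_adic_abs(x):
    """|x|_2 = 2^(-v), where 2^v is the largest power of 2 dividing x."""
    if x == 0:
        return 0.0
    x, v = abs(x), 0
    while x % 2 == 0:
        x //= 2
        v += 1
    return 2.0 ** -v

partial = 0
for n in range(1, 11):
    partial += 2 ** (n - 1)                  # partial sum 2^n - 1
    print(n, two_adic_abs(partial - (-1)))   # 2-adic distance to -1: 2^(-n)
```

The printed distances halve at every step, which is precisely the statement that the partial sums converge to $-1$ in the $2$-adic metric.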