Let $(X, d)$ be complete and totally bounded. Then $X$ is compact

I'm trying to prove this interesting result. It took me almost $2$ hours to come up with the proof. The main idea is that at least one element of a finite partition of an infinite set is infinite. This is how total boundedness comes into play. Could you check my attempt?

Let $(X, d)$ be a complete and totally bounded metric space. Then $X$ is compact.

My attempt:

Lemma: Every sequentially compact metric space is compact. [A proof is provided here]

By our lemma, it suffices to prove that $X$ is sequentially compact. Let $(x_m)$ be a sequence in $X$. Because $X$ is totally bounded, for each $n \in \mathbb N^*$ there exists a finite cover of $X$ by open balls of radius $n^{-1}$, say $$C_n = \left \{\mathbb B \left (c_{n,k}, n^{-1} \right) \,\middle\vert\, k = 1, \ldots, \varphi (n) \right \}.$$

We define a strictly increasing map $\psi:\mathbb N^* \to \mathbb N$ and a decreasing sequence $(I_n)$ of infinite subsets of $\mathbb N$ recursively as follows. Let $I_0 := \mathbb N$. Suppose $I_{n-1}$ is infinite. Because $C_n$ is a finite cover, the pigeonhole principle yields $\bar k \in \{1, \ldots, \varphi (n)\}$ such that the set $$J :=\left \{m \in I_{n-1} \,\middle\vert\, x_m \in \mathbb B \left (c_{n,\bar k}, n^{-1} \right ) \right\}$$ is infinite. Let $\psi(n) := \min J$ and $$I_n := J \setminus \{\psi(n)\} \subsetneq I_{n-1}.$$ Since every element of $I_n$ exceeds $\min J = \psi(n)$, the map $\psi$ is strictly increasing, so $(x_{\psi(n)})$ is a subsequence of $(x_m)$.

We claim that $(x_{\psi(n)})$ is a Cauchy sequence. Given $\varepsilon >0$, there is $N \in \mathbb N^*$ such that $2/N < \varepsilon$. For all $n > N$ we have $\psi(n) \in I_{n-1} \subseteq I_N$, so $x_{\psi(n)} \in \mathbb B \left (c_{N,\bar k}, N^{-1} \right)$, where $\bar k$ is the index chosen at stage $N$. Hence $$d \left (x_{\psi(n)}, x_{\psi(n')} \right) \le d \left (x_{\psi(n)}, c_{N,\bar k} \right) + d \left (c_{N,\bar k}, x_{\psi(n')} \right) < 2N^{-1} < \varepsilon, \quad \forall n,n' > N.$$

Because $X$ is complete, the Cauchy subsequence $(x_{\psi(n)})$ converges. Thus every sequence in $X$ has a convergent subsequence, i.e., $X$ is sequentially compact, and the lemma completes the proof.
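To see the recursion in action, here is a small Python sketch of my own (purely illustrative, not part of the proof): it runs the extraction on a finite prefix of a sequence in the totally bounded space $[0,1]$, covered at stage $n$ by the balls $\mathbb B(k/n, n^{-1})$. Since a program cannot hold an infinite set, the prefix stands in for the whole sequence and "an infinite piece of the partition" becomes "the piece holding the most surviving indices"; the names `extract` and `psi` are mine.

```python
# Illustrative sketch of the recursion: extract a nearly-Cauchy
# subsequence from a finite prefix of a sequence in [0, 1].
# "Infinite piece" is approximated by "largest piece" (pigeonhole).

def extract(x, steps):
    """x: list of points in [0, 1]; returns the indices psi(1), ..., psi(steps)."""
    I = set(range(len(x)))                    # I_0 = N, truncated to the prefix
    psi = []
    for n in range(1, steps + 1):
        # finite cover C_n of [0, 1] by the balls B(k/n, 1/n), k = 0, ..., n
        centers = [k / n for k in range(n + 1)]
        # pigeonhole: keep the ball holding the most surviving indices
        c = max(centers, key=lambda ctr: sum(abs(x[m] - ctr) < 1 / n for m in I))
        J = {m for m in I if abs(x[m] - c) < 1 / n}
        psi.append(min(J))                    # psi(n) := min J
        I = J - {min(J)}                      # I_n, a strict subset of I_{n-1}
    return psi

# a non-convergent sequence with two cluster points, 0.2 and roughly 0.7
x = [0.2 if m % 2 == 0 else 0.7 + 1 / (m + 3) for m in range(2000)]
idx = extract(x, steps=20)

assert all(a < b for a, b in zip(idx, idx[1:]))   # psi is strictly increasing
tail = [x[m] for m in idx[10:]]                   # terms with n > N = 10
assert max(tail) - min(tail) < 2 / 10             # the 2/N Cauchy estimate
```

The two assertions mirror the two claims of the proof: $\psi$ is strictly increasing, and for $n, n' > N$ the extracted terms lie within $2/N$ of each other.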


The idea is fine, but in my opinion the write-up could be clearer:

To see that a sequence $(x_n)$ has a Cauchy subsequence:

As $X$ is totally bounded, it has a finite cover by balls of radius $\frac{1}{2}$. As the cover is finite, by the pigeonhole principle there is some infinite subset $I_1$ of $\Bbb N$ such that all $x_n$, $n \in I_1$, lie in one of the balls. By the triangle inequality (via the centre) we thus see

$$\forall n,m \in I_1: d(x_n,x_m) < 1$$

and define $n_1 = \min(I_1)$.

Now the recursion: having defined infinite sets $I_1, \ldots, I_n$ such that

$$\forall k,m \in I_n: d(x_{k},x_m) < \frac{1}{n}\tag{1}$$

and so that the $I_k$ for $k \le n$ are nested downward and the indices $n_k \in I_k$ are increasing for $k = 1, \ldots, n$, we define $I_{n+1}$ similarly: finitely many balls of radius $\frac{1}{2(n+1)}$ cover $X$, and we find an infinite $I_{n+1} \subseteq I_n\setminus \{n_k: k \le n\}$ such that all $x_n$, $n \in I_{n+1}$, lie in a single ball, so that $(1)$ again holds with $n$ replaced by $n+1$. Defining $n_{n+1}=\min(I_{n+1})$ ensures that the nestedness and the increasingness of the $n_k$ are preserved.

Now it is clear that the subsequence $x_{n_1},x_{n_2},\ldots$ is Cauchy: given $\varepsilon>0$, find $N$ such that $\frac{1}{N} < \varepsilon$. Then $n_k \in I_N$ for all $k \ge N$ by construction, and $(1)$ tells us that $d(x_{n_k},x_{n_l}) < \frac{1}{N} < \varepsilon$ for all $k, l \ge N$.

Then completeness does the rest: $(x_n)_n$ has a convergent subsequence, so $X$ is sequentially compact and hence compact.
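For concreteness, here is a hedged Python sketch of this centre-free bookkeeping (my own illustration, with a finite prefix standing in for the infinite sequence and "largest" standing in for "infinite"; the name `nested_sets` is mine): balls of radius $\frac{1}{2n}$ make each $I_n$ a set of indices whose points lie within $\frac1n$ of one another, so the centres can be forgotten as soon as $I_n$ is built.

```python
# Sketch of the centre-free recursion: at stage n, cover [0, 1] by
# finitely many balls of radius 1/(2n), so the surviving index set I_n
# has diameter < 1/n and no centre needs to be remembered afterwards.

def nested_sets(x, steps):
    """x: points in [0, 1]; returns the final I_n and the picks n_1, ..., n_steps."""
    I, picks = set(range(len(x))), []
    for n in range(1, steps + 1):
        r = 1 / (2 * n)                            # radius 1/(2n) => diameter < 1/n
        grid = [k * r for k in range(2 * n + 1)]   # finitely many centres cover [0, 1]
        c = max(grid, key=lambda g: sum(abs(x[m] - g) < r for m in I))
        I = {m for m in I if abs(x[m] - c) < r} - set(picks)
        picks.append(min(I))                       # n_n := min(I_n)
    return I, picks

x = [0.2 if m % 2 == 0 else 0.7 + 1 / (m + 3) for m in range(2000)]
I, picks = nested_sets(x, steps=10)

assert all(a < b for a, b in zip(picks, picks[1:]))   # the n_k increase
vals = [x[m] for m in I]
assert max(vals) - min(vals) < 1 / 10                 # property (1) at n = 10
```

The final assertion is exactly property $(1)$: once $I_{10}$ is built, every pair of its points is within $\frac{1}{10}$, with no reference to any centre.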

I think this write-up, though it uses essentially the same idea, is clearer and carries less notational overhead: there is no need to keep track of the centres; the only essential fact is that for each $r > 0$ we have a finite cover by sets of diameter $< r$, and we use that.