Proving that every isometry of $\mathbb{R}^n$ is a composition of at most $n+1$ reflections
I know, for example, that every isometry of $\mathbb{R}^3$ can be written as a composition of at most $4$ reflections (through planes that do not necessarily contain the zero vector).
I wish to prove the more general statement that every isometry of $\mathbb{R}^n$ is a composition of at most $n+1$ reflections.
The proofs for the cases $n=2,3$ that I saw are a bit technical, and I don't think that rotating and reflecting in $\mathbb{R}^n$ is a good way to show this (though it is possible, and probably a lot more technical...)
I need help with my strategy for proving this: I want to use induction, knowing the claim is true for $n=1$.
The idea is this: if I could define an isometry of the required form that acts the same as the given isometry on the $0$ vector and on $e_{1},\dots,e_{n}$, then I would be done.
Now I want to use the induction hypothesis somehow. My thought is that the given isometry $f$ takes $e_{1},\dots,e_{n}$ to $n$ points, and by the induction hypothesis there is a composition of at most $n$ reflections that takes the projections of $e_{1},\dots,e_{n}$ to $\mathbb{R}^{n-1}$ to the projections of $f(e_{1}),\dots,f(e_{n})$.
From here I'm a bit lost. I want to say that this construction also takes the zero vector of $\mathbb{R}^{n-1}$ to the projection of $f(0)$ to $\mathbb{R}^{n-1}$, and that I can compose what I got with one more reflection in a way that won't ruin what I already did and will also take care of the last coordinate.
Any thoughts?
Edit: The proof in the link in the accepted answer is correct, but it seems that the proof given in the other answer is what I am trying to do. I would appreciate it if someone could go into more detail (I commented on that answer about what I don't understand in it).
Edit 2: I tried writing the proof and completing all the details; did I do this correctly?
We will prove by induction that every isometry of $\mathbb{R}^n$ can be represented as a composition of at most $n+1$ reflections.
Base case: We know that the claim is true for $n=1$ (and for $n=2$, but I think that this will follow).
Step: Let $V:=\left\{ x\in\mathbb{R}^n \mid x_{n}=0\right\}$ and let $f\in \mathrm{Iso}(\mathbb{R}^n)$.
If $f(0)\neq 0$ then there exists a reflection $\tau$ such that $\tau\circ f(0)=0$ (explicitly: $\tau$ is the reflection through the hyperplane consisting of all points of $\mathbb{R}^n$ at equal distance from $0$ and $f(0)$; note that this hyperplane does not contain the origin, so $\tau$ is an affine reflection rather than an element of $O(n)$).
We now assume $f(0)=0$ (otherwise we write $g=\tau\circ f$ and continue with $g$).
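As a quick numerical sanity check of this first step, here is a sketch of the reflection $\tau$ through the perpendicular bisector hyperplane of $0$ and $f(0)$ (the sample vector `p`, standing in for $f(0)$, and the helper name are arbitrary choices of mine):

```python
import numpy as np

def bisector_reflection(p):
    """Affine reflection across the hyperplane of points equidistant
    from 0 and p (the perpendicular bisector of the segment [0, p]).
    It swaps 0 and p."""
    n_hat = p / np.linalg.norm(p)          # unit normal to the mirror
    c = p / 2.0                            # a point on the mirror
    return lambda x: x - 2 * np.dot(x - c, n_hat) * n_hat

# hypothetical f(0), for illustration only
p = np.array([3.0, -1.0, 2.0])
tau = bisector_reflection(p)
assert np.allclose(tau(p), np.zeros(3))    # tau sends f(0) to 0
assert np.allclose(tau(np.zeros(3)), p)    # and 0 to f(0)
```

Note that `tau` is affine (its mirror passes through `p / 2`, not through the origin), matching the remark above.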
Consider $f|_{V}$: since $f$ is an isometry of $\mathbb{R}^n$, we have $d(x,y)=d(f(x),f(y))$ for all $x,y\in\mathbb{R}^n$, and in particular for every $x,y\in V$.
Let $E:\mathbb{R}^n\to\mathbb{R}^{n-1}$ be the projection onto the first $n-1$ coordinates. Then $E(V)=\mathbb{R}^{n-1}$, and $E\circ f|_{V}$ is an isometry of $\mathbb{R}^{n-1}$ which fixes the origin, hence by the induction hypothesis it can be represented as a composition of at most $n-1$ linear reflections, $R=R_{1}\circ\dots\circ R_{n-1}$.
We now extend $R$ to $\mathbb{R}^n$ in this manner: denote the matrix representing $R$ by $R_{M}$ (so $R_{M}\in M_{n-1}(\mathbb{R})$); then the extension of $R$ to $\mathbb{R}^n$ is represented by $\begin{pmatrix}R_{M} & 0\\ 0 & 1 \end{pmatrix}$, that is, on the first $n-1$ coordinates we act as $R$ did and we keep the last coordinate (note that $\det R_{M}=\pm1$ and that it is straightforward to check that $R_{M}$ is an orthogonal matrix; $R_{1},\dots,R_{n-1}$ are extended from $\mathbb{R}^{n-1}$ to $\mathbb{R}^{n}$ in the same manner, and each extension is again a reflection).
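The block-matrix extension can be checked numerically. A minimal sketch (the sample $2\times 2$ reflection and the helper name are my own choices for illustration), verifying that the extension stays orthogonal and fixes $e_n$:

```python
import numpy as np

def extend(R_M):
    """Embed an (n-1)x(n-1) orthogonal matrix into R^n as the block
    matrix [[R_M, 0], [0, 1]]: act as R on the first n-1 coordinates
    and keep the last coordinate."""
    n = R_M.shape[0] + 1
    M = np.eye(n)
    M[:n-1, :n-1] = R_M
    return M

# sample 2x2 reflection (swap of the two axes, det = -1), extended to 3x3
R_M = np.array([[0.0, 1.0], [1.0, 0.0]])
M = extend(R_M)
assert np.allclose(M.T @ M, np.eye(3))                       # still orthogonal
assert np.allclose(M @ np.array([0.0, 0.0, 1.0]), [0.0, 0.0, 1.0])  # fixes e_n
```

Since the extension of a reflection keeps its mirror and adds the last axis to it, the extended map is again a reflection, as claimed above.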
Denote $\varphi=R^{-1}\circ f$ (here $R$ is the extended isometry of $\mathbb{R}^n$). If $\varphi(e_{n})=e_{n}$ then we are done; otherwise there exists a reflection $\alpha$ such that $\alpha\circ\varphi(e_{n})=e_{n}$ and such that $\alpha|_{V}\equiv \mathrm{id}$ (explicitly: $\alpha$ is the reflection through the hyperplane consisting of all points $v\in\mathbb{R}^n$ with $\langle v,e_{n}-\varphi(e_{n})\rangle=0$).
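This $\alpha$ is exactly a Householder reflection with normal $e_{n}-\varphi(e_{n})$: since $\varphi$ preserves norms, $\|\varphi(e_{n})\|=\|e_{n}\|=1$, and the standard Householder identity then sends $\varphi(e_{n})$ to $e_{n}$. A minimal numerical sketch (the sample unit vector `b`, standing in for $\varphi(e_{n})$, is an arbitrary choice of mine):

```python
import numpy as np

def householder(v):
    """Reflection across the hyperplane {x : <x, v> = 0}."""
    v = v / np.linalg.norm(v)
    return np.eye(len(v)) - 2.0 * np.outer(v, v)

# hypothetical phi(e_n): any unit vector works, since phi preserves norms
b = np.array([2.0, -1.0, 2.0]) / 3.0          # |b| = 1
e_n = np.array([0.0, 0.0, 1.0])
alpha = householder(e_n - b)
assert np.allclose(alpha @ b, e_n)            # alpha maps phi(e_n) to e_n
assert np.allclose(alpha @ alpha, np.eye(3))  # alpha is an involution
```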
Now $\alpha\circ\varphi$ fixes $V$ pointwise and fixes $e_{n}$, hence $\alpha\circ\varphi=\mathrm{id}$ and $f=R\circ\alpha$; since $R$ is a composition of at most $n-1$ reflections, $f$ is a composition of at most $n$ reflections.
In the first case ($f(0)\neq 0$) we apply the process to $\tau\circ f$, hence in that case $f$ is represented by at most $n+1$ reflections.
Any comment about the style of the proof is also welcome; this is one of the first times I am writing a proof in English.
Ok, here is my idea, I hope this is what you had in mind:
We use one reflection to get an isometry with $f(0)=0$. I will now assume that $f$ has this property and will show that $n$ reflections suffice. edit: I should probably say at this point that the reason for step 1 is that $f$ is now norm-preserving (and in fact linear and inner-product-preserving), which simplifies the notation.
I will inductively construct reflections $r_i$ such that their composition $r_1\circ\cdots\circ r_n$ is the inverse of $f$. Since reflections are their own inverses, this proves the statement.
Assume that $f$ fixes a subspace $N$ of dimension $d$ pointwise (and that this subspace is maximal in the sense that there exists no higher-dimensional subspace which is fixed pointwise). We may then write $\mathbb R^n=N\oplus N^\bot$. I will show that we can find a reflection $r$ such that $r\circ f$ pointwise fixes a subspace of one dimension higher.
Take $0\neq x\in N^\bot$. Then $f(x)\in N^\bot$, since $0=\left< x,n\right>=\left< f(x),f(n)\right>=\left< f(x),n\right>$ for all $n\in N$; and $f(x)\neq x$, since otherwise $f$ would fix the larger subspace $N\oplus\mathbb Rx$ pointwise, contradicting the maximality of $N$. Moreover $f(x)$ is either linearly independent of $x$ or just $-x$. Indeed, assume $\lambda f(x)=x$; then $\left< x,x\right>=\lambda^2\left< f(x),f(x)\right>=\lambda^2\left< x,x\right>$, so $\lambda^2=1$, with $\lambda\neq 1$ by assumption. If $f(x)=-x$, we just reflect $x$ and keep $N$ fixed at the same time and are done. Hence assume $f(x)$ and $x$ are linearly independent.
Find a basis of $N^\bot$ containing $x$ and $f(x)$, say $\{x,f(x),b_1,\dots,b_k\}$. Choose $u=x+f(x)$ and $v=x-f(x)$. Then $\{u,v,b_1,\dots,b_k\}$ is also a basis of $N^\bot$, since $x=\tfrac12(u+v)$ and $f(x)=\tfrac12(u-v)$.
Take $r$ to be the reflection which takes $v$ to $-v$ and fixes its orthogonal complement (in particular $u$ and $N$, since $\left< u,v\right>=\left< x,x\right>-\left< f(x),f(x)\right>=0$ and $v\in N^\bot$). Then $r\circ f$ fixes the $(d+1)$-dimensional space $N\oplus \mathbb Rx$ pointwise, because $r(f(x))=r\big(\tfrac12(u-v)\big)=\tfrac12(u+v)=x$.
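This step can be sketched numerically. Below, the sample orthogonal $f$ fixing $N=\operatorname{span}\{e_1\}$ pointwise (a rotation in the $e_2e_3$-plane) is purely my choice for illustration; the reflection $r$ along $v=x-f(x)$ makes $r\circ f$ fix $x$ while leaving $N$ untouched:

```python
import numpy as np

def householder(v):
    """Reflection sending v to -v and fixing the hyperplane v^perp."""
    v = v / np.linalg.norm(v)
    return np.eye(len(v)) - 2.0 * np.outer(v, v)

# sample orthogonal f fixing N = span{e_1}: a rotation in the e_2,e_3 plane
theta = 0.7
f = np.array([[1.0, 0.0, 0.0],
              [0.0, np.cos(theta), -np.sin(theta)],
              [0.0, np.sin(theta),  np.cos(theta)]])

x = np.array([0.0, 1.0, 0.0])            # x in N^perp, not fixed by f
r = householder(x - f @ x)               # reflect v = x - f(x) to -v
g = r @ f
assert np.allclose(g @ x, x)             # r∘f now fixes x as well
assert np.allclose(g @ np.array([1.0, 0.0, 0.0]), [1.0, 0.0, 0.0])  # still fixes N
```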
After $n$ steps we have (at most) $n$ reflections such that $r_1\circ...\circ r_n\circ f$ is the identity. We are done.
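Putting the steps of this answer together, the whole construction amounts to greedily peeling off one reflection per step. A sketch, assuming $f$ is already linear (given as an orthogonal matrix `Q`; the helper names are mine):

```python
import numpy as np

def householder(v):
    """Reflection across the hyperplane {x : <x, v> = 0}."""
    v = v / np.linalg.norm(v)
    return np.eye(len(v)) - 2.0 * np.outer(v, v)

def reflection_factors(Q, tol=1e-12):
    """Write the orthogonal matrix Q as a product of at most n reflections,
    mirroring the induction: make e_1 fixed, then e_2, and so on.  Each
    correction is orthogonal to the already-fixed basis vectors, so
    earlier fixed points are preserved."""
    n = Q.shape[0]
    factors = []
    A = Q.copy()
    for i in range(n):
        e = np.eye(n)[:, i]
        if not np.allclose(A @ e, e, atol=tol):
            r = householder(e - A @ e)   # r maps A e_i back to e_i
            factors.append(r)
            A = r @ A                    # A now fixes e_1, ..., e_i
    # A ends as the identity, so Q = factors[0] @ factors[1] @ ...
    return factors

# demo: a rotation of R^3 needs at most 3 reflections
theta = 1.1
Q = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0, 0.0, 1.0]])
rs = reflection_factors(Q)
P = np.eye(3)
for r in rs:
    P = P @ r
assert len(rs) <= 3
assert np.allclose(P, Q)
```

With the one extra affine reflection handling $f(0)\neq 0$, this gives the claimed bound of $n+1$.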
I prove a special case below: An isometry $f$ of $\mathbb{R}^n$ such that $f(0)=0$ is the product of at most $n$ linear reflections. The general case follows from this since if $f(0) \neq 0$ then there exists a (unique) reflection $\sigma$ such that $(\sigma \circ f)(0) = 0$.
So assume $f(0)=0$. Let $V = \{x \in \mathbb{R}^n \mid x_n =0\}$. If $f(e_n) = e_n$ then $f|_V$ is an isometry of $V$. By induction it is the product of at most $n-1$ reflections in $V$. Extending these reflections to $\mathbb{R}^n$ results in a product for $f$ since $e_n$ is a fixed point.
If $f(e_n) \neq e_n$ then there is a unique linear reflection $\sigma$ such that $(\sigma \circ f)(e_n) = e_n$. (The mirror of $\sigma$ is perpendicular to $e_n - f(e_n)$.) This brings us back to the first case and so $\sigma \circ f$ is the product of at most $n-1$ reflections and $f = \sigma \circ (\sigma \circ f)$.
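A small numerical sketch of this recursive step (the sample linear isometry `f` is an arbitrary rotation chosen for illustration): the reflection $\sigma$ with mirror perpendicular to $e_n - f(e_n)$ makes $\sigma \circ f$ fix $e_n$, so $\sigma \circ f$ maps $V = e_n^\perp$ to itself and the induction can continue inside $V$.

```python
import numpy as np

def householder(v):
    """Linear reflection across the hyperplane {x : <x, v> = 0}."""
    v = v / np.linalg.norm(v)
    return np.eye(len(v)) - 2.0 * np.outer(v, v)

# sample linear isometry of R^3 with f(e_3) != e_3
theta = 0.9
f = np.array([[1.0, 0.0, 0.0],
              [0.0, np.cos(theta), -np.sin(theta)],
              [0.0, np.sin(theta),  np.cos(theta)]])

e3 = np.array([0.0, 0.0, 1.0])
sigma = householder(e3 - f @ e3)        # mirror perpendicular to e_3 - f(e_3)
g = sigma @ f
assert np.allclose(g @ e3, e3)          # sigma∘f fixes e_3 ...
# ... hence maps V = e_3^perp to itself, so the recursion continues in V
assert np.isclose((g @ np.array([0.0, 1.0, 0.0]))[2], 0.0)
```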
To solve this problem you can use the matrix approach given in this article. See Theorem A.4 and Corollary A.7.