Can a cube always be fitted into the projection of a cube?

This is unfortunately not a complete answer, but there isn't enough space for me to do this in the comments, and it is at least a partial answer. First, let me lay down the boundaries: I will provide an outline of a proof that a unit $k$-dimensional hypercube always fits into an ORTHOGONAL projection of an $n$-dimensional hypercube onto a $k$-dimensional subspace. And I'll be assuming the standard Euclidean inner product/norm throughout.

So the first thing to notice is that the corners of the unit hypercube in $\mathbb{R}^n$ are represented by the set of vectors $\{(x_1,\ldots,x_n)\in \mathbb{R}^n: x_i=1/2$ or $x_i=-1/2$ for each $1\leq i\leq n\}$; in other words, all possible sign combinations of $\pm 1/2$ in the components of the vector. Now, to make the argument simpler to follow from here on, let's instead work with the hypercube of side-length 2 centered at the origin. Then each corner is represented by a vector whose components form a sign combination of $\pm 1$.
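As a quick sanity check of this corner description, here's a short Python sketch (numpy assumed) that enumerates the corners of the side-length-2 hypercube:

```python
import itertools
import numpy as np

def hypercube_corners(n):
    """All 2^n corners of the side-length-2 hypercube centered at the origin:
    every possible sign combination of +/-1 across the n components."""
    return np.array(list(itertools.product([-1.0, 1.0], repeat=n)))

corners = hypercube_corners(4)
print(corners.shape)               # (16, 4): 2^4 corners in R^4
print(np.linalg.norm(corners[0]))  # every corner has norm sqrt(4) = 2
```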

So the proof I have in mind is by induction. We have $\mathbb{R}^n$ with a hypercube of side-length 2 centered at the origin. First consider any 1-dimensional subspace of $\mathbb{R}^n$, spanned by a unit vector $\eta$. This vector is of the form $\frac{1}{\sqrt{m}}(l_1,\ldots,l_n)$ where $m=\sum_{i=1}^n l_i^2$. Now, since the corner vectors of the side-length-2 hypercube consist of all possible sign combinations of $\pm 1$, at least one of the corner vectors (say $v$) satisfies $v\cdot \eta=\frac{1}{\sqrt{m}}\sum_{i=1}^n |l_i|$; in particular, $v$ is the corner vector whose components' signs agree with the signs of the components of $\eta$. So the orthogonal projection of $v$ onto $\eta$ is \begin{equation} (v\cdot \eta)\eta=\Big(\frac{1}{\sqrt{m}}\sum_{i=1}^n |l_i|\Big)\frac{1}{\sqrt{m}}(l_1,\ldots,l_n), \end{equation} and this vector has norm \begin{equation} \|(v\cdot \eta)\eta\|=\sqrt{\frac{(\sum_{i=1}^n |l_i|)^2}{m}} \geq 1, \end{equation} because expanding the square gives $(\sum_{i=1}^n |l_i|)^2 \geq \sum_{i=1}^n l_i^2 = m$.

Now if we take the corner vector $v'=-v$, then $(v'\cdot \eta)\eta=(-v\cdot \eta)\eta=-(v \cdot \eta)\eta$, and it follows that \begin{equation}\|(v \cdot \eta)\eta-(v'\cdot\eta)\eta\|=\|2(v \cdot \eta)\eta\|\geq 2. \end{equation} This means that the 1-dimensional hypercube of side-length 2 (a segment of length 2) fits inside the projection, as required.
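This base case is easy to check numerically. The sketch below (numpy assumed) builds the sign-matching corner $v$ for random directions $\eta$ and confirms that the projected segment from $(v'\cdot\eta)\eta$ to $(v\cdot\eta)\eta$ always has length at least 2:

```python
import numpy as np

rng = np.random.default_rng(0)

def projected_length(eta):
    """Length of the projected segment between the sign-matching corner
    v = sign(eta) of the side-length-2 cube and its antipode v' = -v."""
    eta = eta / np.linalg.norm(eta)  # make eta a unit vector
    v = np.sign(eta)
    v[v == 0] = 1.0                  # break zero components arbitrarily
    # The projections of v and v' = -v are antipodal, so the distance
    # between them is 2 (v . eta) = 2 * sum |eta_i| >= 2 ||eta||_2 = 2.
    return 2.0 * np.dot(v, eta)

for n in [1, 2, 3, 5, 10]:
    for _ in range(100):
        assert projected_length(rng.normal(size=n)) >= 2.0 - 1e-12
print("projected segment always has length >= 2")
```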

Now the induction hypothesis is that for every $k$-dimensional subspace of $\mathbb{R}^n$ (with $k\leq n-1$), we can fit a $k$-dimensional hypercube of side-length 2 into the orthogonal projection of the side-length-2 $n$-dimensional hypercube onto this subspace.

Now take any $(k+1)$-dimensional subspace of $\mathbb{R}^n$. From here on, it's just a sketch of the proof: for such a subspace (let's denote it $W$) we can find an orthonormal basis, and consequently write it as an orthogonal direct sum $W=W_1 \oplus W_1^{\perp}$, where $W_1^{\perp}$ denotes the orthogonal complement of $W_1$ within $W$. In particular, we can let $W_1$ be the span of any single vector in our chosen orthonormal basis for $W$. By the induction hypothesis, the $k$-dimensional hypercube in $W_1^{\perp}$ fits into the projection of the $n$-dimensional cube onto that space, and by the base-case argument the $1$-dimensional cube fits into the projection of the $n$-dimensional cube onto $W_1$.
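The orthonormal decomposition of $W$ can be made concrete with a QR factorization (a short sketch, numpy assumed; the dimensions are arbitrary example values):

```python
import numpy as np

rng = np.random.default_rng(1)

n, k = 5, 2                      # W will be (k+1)-dimensional, here 3-dim
A = rng.normal(size=(n, k + 1))  # k+1 generic spanning vectors for W
Q, _ = np.linalg.qr(A)           # columns of Q: an orthonormal basis of W

w1 = Q[:, 0]                     # unit vector spanning W_1
W1_perp = Q[:, 1:]               # orthonormal basis of W_1^perp inside W

# The basis is orthonormal, so W really is the orthogonal direct sum
# W = W_1 (+) W_1^perp:
assert np.allclose(Q.T @ Q, np.eye(k + 1))
assert np.allclose(W1_perp.T @ w1, 0.0)
print("W split into W_1 and its orthogonal complement within W")
```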

SO NOW, the question is: if we take the $(k+1)$-dimensional hypercube constructed by extruding the $k$-dimensional cube in $W_1^{\perp}$ along the unit vector spanning $W_1$ (in both directions by 1 unit), can we deduce that it fits into the projection of the $n$-dimensional cube onto $W$? This is where my proof is not complete... I think it comes down to choosing the orthonormal basis in a particular way, i.e. so that the vector spanning $W_1$ is parallel to the cut lines of the projection space $W$ where it intersects the hypercube in $\mathbb{R}^n$, but unfortunately I have no rigorous way of defining this, or even of knowing whether it is the best choice. From playing around in Mathematica with a plane intersecting a cube in $\mathbb{R}^3$, I am quite convinced, though, that it is always possible to make a choice of basis on the plane so that this will work...
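That Mathematica-style experiment can be replicated in Python (a sketch, numpy assumed): project the corners of the side-length-2 cube in $\mathbb{R}^3$ onto a plane, take the convex hull of the shadow, and search over rotations of a centered side-2 square. Since the shadow is centrally symmetric and convex, if any translate of the square fits then a centered copy does too, so searching centered squares over angles suffices.

```python
import itertools
import numpy as np

def cross2(o, a, b):
    """z-component of the cross product (a - o) x (b - o)."""
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def convex_hull(points):
    """Andrew's monotone chain; returns hull vertices in CCW order."""
    pts = sorted(map(tuple, points))
    def chain(seq):
        h = []
        for p in seq:
            while len(h) >= 2 and cross2(h[-2], h[-1], p) <= 0:
                h.pop()
            h.append(p)
        return h
    return np.array(chain(pts)[:-1] + chain(pts[::-1])[:-1])

def contains(hull, p, eps=1e-9):
    """True if p lies in the closed CCW convex polygon `hull`."""
    m = len(hull)
    return all(cross2(hull[i], hull[(i + 1) % m], p) >= -eps for i in range(m))

def square_fits(normal, angles=np.linspace(0, np.pi / 2, 181)):
    """Does some rotated, centered copy of the side-length-2 square fit
    inside the orthogonal projection ("shadow") of the side-length-2 cube
    onto the plane with the given normal?"""
    normal = np.asarray(normal, float)
    normal = normal / np.linalg.norm(normal)
    # Orthonormal basis (b1, b2) of the plane normal^perp, via SVD.
    b1 = np.linalg.svd(normal.reshape(1, 3))[2][1]
    b2 = np.cross(normal, b1)
    corners = np.array(list(itertools.product([-1.0, 1.0], repeat=3)))
    shadow = convex_hull(corners @ np.column_stack([b1, b2]))
    for t in angles:
        c, s = np.cos(t), np.sin(t)
        # Corners of the side-2 square rotated by angle t about the origin.
        square = [(c - s, s + c), (-c - s, c - s), (s - c, -s - c), (c + s, s - c)]
        if all(contains(shadow, q) for q in square):
            return True
    return False

# Normal (1,1,1): the shadow is a regular hexagon with inradius sqrt(2),
# which equals the circumradius of the side-2 square, so the square fits.
print(square_fits([1.0, 1.0, 1.0]))
```

This only samples finitely many angles, so a `False` would be inconclusive; it is an exploratory check in the same spirit as the Mathematica experiment, not a proof.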

Now, just as a btw: from playing around with an oblique projection of the unit square onto the $y$-axis with projection matrix $\begin{bmatrix} 0 & 0 \\ a & 1 \end{bmatrix}$, where $a$ is any number, it actually seems as if the orthogonal projection is the "worst case scenario"... so I think this is true for oblique projections as well, but I have no idea how to attempt to prove this.
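That observation is easy to check directly: with the matrix above, the image of the unit square on the $y$-axis is a segment of length $|a|+1$, minimized at the orthogonal case $a=0$. A quick sketch (numpy assumed):

```python
import itertools
import numpy as np

def shadow_length(a):
    """Length of the oblique projection of the unit square onto the y-axis,
    using the (idempotent) projection matrix [[0, 0], [a, 1]]."""
    P = np.array([[0.0, 0.0], [a, 1.0]])
    corners = np.array(list(itertools.product([-0.5, 0.5], repeat=2)))
    ys = (P @ corners.T)[1]  # images all lie on the y-axis
    return ys.max() - ys.min()

# The shadow has length |a| + 1, so a = 0 (orthogonal) is the worst case:
for a in [-2.0, -0.5, 0.0, 1.0, 3.0]:
    print(a, shadow_length(a))
```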


Not an answer, but a few observations from my vague (possibly wrong) recollections of local theory:

  • It certainly has enough volume to contain the smaller cube. If we project $B_{\infty}^n$ onto the hyperplane $(x_1, x_2, \dots, x_n)^{\perp}$ with $\sum x_i^2=1$, then $$Vol(P_{H}B_\infty^n)=Vol(B_\infty^{n-1})\sum|x_i|\geq Vol(B_\infty^{n-1}).$$

However, that alone is not enough to guarantee that it contains the smaller cube.
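The inequality $\sum|x_i|\geq 1$ in that volume bound is just $\|x\|_1 \geq \|x\|_2$ for the unit normal; a quick numerical sanity check (numpy assumed):

```python
import numpy as np

rng = np.random.default_rng(2)

# For any unit normal x, sum |x_i| = ||x||_1 >= ||x||_2 = 1, which is what
# makes Vol(P_H B_inf^n) = Vol(B_inf^{n-1}) * sum|x_i| >= Vol(B_inf^{n-1}).
for n in [2, 3, 10, 50]:
    for _ in range(1000):
        x = rng.normal(size=n)
        x = x / np.linalg.norm(x)
        assert np.sum(np.abs(x)) >= 1.0 - 1e-12
print("||x||_1 >= ||x||_2 on the unit sphere: OK")
```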

  • By duality, we can look at $B_1^n$: the ball $P_{H}B_\infty^n$ is the dual ball of the section of $B_1^n$ by the same hyperplane. So an equivalent question would be: if we section $B_1^n$ by a hyperplane, can that section be fitted inside $B_1^{n-1}$?