Solution 1:

First, this can be checked locally, so we may as well assume $E = F\times M$.

Use coordinates $x^i$ on $F$ and $y^j$ on $M$. Then a $k$-form is given by $\alpha =\sum f_{IJ}\, dx^I\wedge dy^J$, where $I = \{i_1,\ldots,i_s\}$, $dx^I$ means $dx^{i_1}\wedge\cdots\wedge dx^{i_s}$ (and similarly for $dy^J$), and $|I|+|J|=k$.
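
To make the multi-index notation concrete (a toy illustration, not part of the argument), take $F = \mathbb{R}^2$ with coordinates $x^1, x^2$, $M = \mathbb{R}^2$ with coordinates $y^1, y^2$, and $k = 2$. Then
$$\alpha = f_{12,\emptyset}\, dx^1\wedge dx^2 \;+\; \sum_{i,j=1}^{2} f_{i,j}\, dx^i\wedge dy^j \;+\; f_{\emptyset,12}\, dy^1\wedge dy^2,$$
where the subscripts record the multi-indices $I$ and $J$.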

The goal is to show that if $i_X(\alpha) = 0$ for all vertical $X$, then $f_{IJ} = 0$ whenever $I\neq \emptyset$.

So fix any $I\neq \emptyset$, say $i_1\in I$. Let $X = \frac{\partial}{\partial x^{i_1}}$, the vector field dual to $dx^{i_1}$. Then $0 = i_X(\alpha)$, so $0 = i_X(\alpha)\left(\frac{\partial}{\partial x^{I\setminus\{i_1\}}}, \frac{\partial}{\partial y^{J}}\right) = \pm f_{IJ}$.
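
To see the mechanics in the toy case above (again, just an illustration): the vertical field $X = \frac{\partial}{\partial x^1}$ gives
$$i_X\alpha = f_{12,\emptyset}\, dx^2 + f_{1,1}\, dy^1 + f_{1,2}\, dy^2, \qquad i_X\alpha\!\left(\tfrac{\partial}{\partial y^1}\right) = f_{1,1},$$
so $i_X\alpha = 0$ forces $f_{1,1} = 0$; evaluating against the other dual basis vectors kills $f_{12,\emptyset}$ and $f_{1,2}$, and repeating with $X = \frac{\partial}{\partial x^2}$ handles the remaining mixed terms.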

Thus the condition guarantees that $f_{IJ} = 0$ whenever $I\neq \emptyset$, so the only terms which appear in $\alpha$ are of the form $f_J dy^J$.

The only problem now is that $f_J$ could still depend on the $F$ factor. This is where the Lie derivative condition comes in.

The Lie derivative of a form is given by Cartan's formula: $L_Y(\alpha) = i_Y\, d\alpha + d\, i_Y(\alpha)$.

Taking $Y$ vertical, this reduces to $L_Y(\alpha) = i_Y(d\alpha)$, since $i_Y(\alpha) = 0$ by hypothesis.

Now, $d\alpha = \sum \frac{\partial{f_J}}{{\partial x^i}} dx^i\wedge dy^J + \sum \frac{\partial{f_J}}{{\partial y^j}} dy^j\wedge dy^J$.

Just as before, by evaluating against a dual basis, the condition $0 = L_X(\alpha) = i_X(d\alpha)$ shows that $\frac{\partial f_J}{\partial x^i} = 0$.
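
Spelled out (a routine check in the same local trivialization): for $X = \frac{\partial}{\partial x^i}$,
$$0 = L_X(\alpha) = i_X(d\alpha) = \sum_J \frac{\partial f_J}{\partial x^i}\, dy^J,$$
since $i_X$ kills every $dy^j\wedge dy^J$ term and picks out the $dx^i$ factor from the first sum; evaluating against $\frac{\partial}{\partial y^J}$ then isolates each coefficient.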

But this implies that each $f_J$ depends only on the $y^j$ coordinates. It is now clear that $\alpha$ is the pullback of something, namely of $\sum f_J\, dy^J$, which makes sense because each $f_J$ is really a function on $M$.

Solution 2:

If you add a connectedness assumption (which is essential, as I mentioned in my comment to your main question), then what you desire follows from the following proposition.

Prop Let $f:M\to N$ be a submersion such that for every $y\in N$, $f^{-1}(y)$ is a connected submanifold of $M$. If $\mu\in\Omega^k(M)$ is a differential form satisfying $i_Y\mu = 0$ and $i_Y d\mu = 0$ for every $Y\in \ker df$, then there is a unique $\nu\in\Omega^k(N)$ such that $\mu = f^*\nu$.

Sketch of proof It suffices to show that the differential form defined by $\nu(f_*X_1,\ldots ,f_*X_k) = \mu(X_1,\ldots, X_k)$ is well-defined. By the assumption that $i_Y\mu = 0$ for $Y\in \ker df$, at a fixed base point $x\in M$ we have that if $f_*X_1 = f_*X'_1$, then $\mu(X_1,\ldots, X_k) = \mu(X'_1, X_2,\ldots,X_k)$; so the value does not depend on the choice of lifts at a point. Next, suppose $X$ is a vector field on $M$ whose pushforward is well-defined, in the sense that there exists a vector field $Z$ on $N$ such that $df(X)|_{x_1} = df(X)|_{x_2} = Z|_y$ whenever $x_1,x_2\in f^{-1}(y)$, and let $Y$ be a vector field in $\ker df$. Then $[X,Y] \in \ker df$; this is easily checked by letting $[X,Y]$ act on pullbacks of functions defined on $N$. Using $i_Yd\mu = 0$, one then concludes that for any vector fields $X_1, \ldots, X_k$ whose pushforwards are well-defined, $\mu(X_1,\ldots, X_k)$ is constant along connected components of $f^{-1}(y)$. q.e.d.
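
For the bracket claim, here is the computation spelled out (same notation as in the sketch): if $g$ is a smooth function on $N$, then $Y(f^*g) = 0$ since $Y$ is vertical, and $X(f^*g) = f^*(Zg)$ since $X$ pushes forward to $Z$, so
$$[X,Y](f^*g) = X\big(Y(f^*g)\big) - Y\big(X(f^*g)\big) = 0 - Y\big(f^*(Zg)\big) = 0.$$
Since $[X,Y](f^*g)$ at $x$ is just $dg$ applied to $df([X,Y])$ at $x$, and $dg$ ranges over all covectors at $f(x)$ as $g$ varies, this forces $df([X,Y]) = 0$, i.e. $[X,Y]\in\ker df$.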


Addendum to address comments: Consider the scalar $\mu(X_1,\ldots, X_k)$ on $M$. Let $Y$ be a vertical vector field; then, by the Leibniz rule for Lie derivatives, $$ Y(\mu(X_1,\ldots,X_k)) = (\mathcal{L}_Y\mu)(X_1,\ldots,X_k) + \mu([Y,X_1],X_2,\ldots,X_k) + \cdots + \mu(X_1,\ldots, X_{k-1}, [Y,X_k]). $$ So local constancy of the scalar field requires that $\mathcal{L}_Y\mu \equiv 0$ and that each $[Y,X_l]$ is vertical.
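
Putting the pieces together: the first term on the right vanishes because $\mathcal{L}_Y\mu = i_Y\, d\mu + d\, i_Y\mu = 0$ by the two hypotheses, and each bracket term vanishes because $[Y,X_l]$ is vertical and $\mu$ annihilates vertical vectors. Hence
$$Y\big(\mu(X_1,\ldots,X_k)\big) = 0$$
for every vertical $Y$, which is exactly the constancy along connected fibers used in the sketch above.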