Optimizing the computational complexity of linear-algebraic operations
In your solution, you pay $O(nd)$ for every new vector $\mathbf{w}$: if you want to evaluate $f$ on $m \gg n$ inputs, the total cost scales as $O(mnd)$ instead of the $O(md^2)$ asked for. (I am assuming here $d \ll n$; otherwise your solution is indeed faster.)
To obtain the expected complexity, rewrite $$\begin{align*} \sum_{i=1}^{n}\sum_{j=1}^{n}(\mathbf{a}_{i}^\intercal \mathbf{w}-\mathbf{b}_{j}^\intercal \mathbf{w})^2 &= \sum_{i=1}^{n}\sum_{j=1}^{n} \sum_{k=1}^d\sum_{\ell=1}^d(\mathbf{a}_{i,k}-\mathbf{b}_{j,k})\mathbf{w}_k(\mathbf{a}_{i,\ell}-\mathbf{b}_{j,\ell})\mathbf{w}_\ell \\ &= \sum_{k=1}^d\sum_{\ell=1}^d \underbrace{\left( \sum_{i=1}^{n}\sum_{j=1}^{n}(\mathbf{a}_{i,k}-\mathbf{b}_{j,k})(\mathbf{a}_{i,\ell}-\mathbf{b}_{j,\ell}) \right)}_{\gamma_{k,\ell}}\mathbf{w}_k\mathbf{w}_\ell \end{align*}$$ Now, using the same idea as in your solution, show that each $\gamma_{k,\ell}$ (none of which depends on $\mathbf{w}$!) can be computed ahead of time in $O(n)$: expand the product and split the double sum into sums over $i$ and over $j$ separately. Computing all $d^2$ of them thus takes $O(nd^2)$ preprocessing time.
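To make the preprocessing step concrete, here is a minimal numpy sketch (the function name `precompute_gamma` and the variable names are mine, not from the question). It uses the expanded form of $\gamma_{k,\ell}$, in which every term is either a Gram-matrix entry or a product of column sums:

```python
import numpy as np

def precompute_gamma(A, B):
    """Gamma[k, l] = sum_{i,j} (A[i,k] - B[j,k]) * (A[i,l] - B[j,l]).

    Expanding the product and splitting the double sum gives
        Gamma = n * A.T @ A + n * B.T @ B
                - outer(sA, sB) - outer(sB, sA),
    where sA, sB are the column sums of A and B.  Each of the d^2
    entries then costs O(n), so the whole matrix costs O(n d^2).
    A and B are (n, d) arrays whose rows are the a_i and b_j.
    """
    n = A.shape[0]
    sA = A.sum(axis=0)  # column sums, O(nd)
    sB = B.sum(axis=0)
    return (n * (A.T @ A) + n * (B.T @ B)
            - np.outer(sA, sB) - np.outer(sB, sA))
```

Note that the naive double loop over $(i, j)$ would cost $O(n^2 d^2)$; the rearrangement above is exactly what brings it down to $O(nd^2)$.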
Once you have that, given a new $\mathbf{w}$ you only have to compute $$ f(\mathbf{w}) = \sum_{k=1}^d\sum_{\ell=1}^d \gamma_{k,\ell} \mathbf{w}_k\mathbf{w}_\ell + \lambda\sum_{k=1}^d \mathbf{w}_k^2 $$ which takes time $O(d^2+d)=O(d^2)$.
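With the $\gamma_{k,\ell}$ in hand, each query is just a $d$-dimensional quadratic form plus the ridge term. A sketch in numpy (again, `f_query` is a name I made up):

```python
import numpy as np

def f_query(w, Gamma, lam):
    # f(w) = w^T Gamma w + lambda * ||w||^2:
    # O(d^2) for the quadratic form, O(d) for the norm,
    # independent of n once Gamma has been precomputed.
    return w @ Gamma @ w + lam * np.dot(w, w)
```

Evaluating $m$ queries therefore costs $O(nd^2 + md^2)$ in total, versus $O(mnd)$ for recomputing the double sum from scratch each time.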