How to justify claims on the complexity of formal proofs without definitions, as described in "Type Theory and Formal Proof" by Nederpelt and Geuvers
Since it is difficult to talk about arbitrary formal systems, I shall restrict this post to FOL theories, but the same notions and phenomena carry over to any reasonable formal system. "Definitions" here refers to definitorial expansion (and see the comments there for a citation).
It turns out that although definitorial expansion is conservative (i.e. every sentence in the original language that is provable in the expanded theory is also provable in the original theory), proofs can blow up exponentially in size if you insist on carrying them out in the original theory.
To see why, consider adding predicate symbols $Q_0,\dots,Q_m$, where each $Q_{k+1}$ is defined by a formula containing two occurrences of $Q_k$. Unfolding one level of definition doubles the number of occurrences of the previous symbol, so translating any formula involving $Q_m$ into the original language yields a formula of length exponential in $m$.
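To make the doubling concrete, here is a toy sketch (with a hypothetical base formula $P(x)$ and a purely syntactic notion of length) that fully unfolds such a chain of definitions and measures the resulting formula strings:

```python
def unfold(k: int) -> str:
    """Return Q_k fully unfolded into the base language, as a string.

    Q_0 is taken to be a base-language formula; each Q_{k+1} is defined
    as a conjunction containing two occurrences of Q_k, mirroring the
    definitional chain in the text.
    """
    if k == 0:
        return "P(x)"  # hypothetical base-language formula
    inner = unfold(k - 1)
    return f"({inner} & {inner})"  # the definition of Q_k uses Q_{k-1} twice

# Length satisfies L(k+1) = 2*L(k) + 5, so it grows as Theta(2^k).
lengths = [len(unfold(k)) for k in range(8)]
```

Each unfolded formula is more than twice as long as the previous one, which is exactly the exponential blowup described above.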
However, all the examples given in that chapter are covered by definitorial expansion, which incurs at most exponential blowup. So I would say it is wrong to claim "considerably worse than exponential growth" in complexity without these "definitions".
That said, once you reify definitions (i.e. add rules or axioms asserting that certain kinds of definitions are captured by objects), so that you can quantify over them, the efficacy of the resulting system can vastly outstrip the original even if the extension is conservative. One example of this reification is the passage from PA to ACA0 (see this MO post). But this does not justify the claim you cited, because it is an argument about conservative second-order extension rather than definitorial expansion.