Notions of stability for differential equations

$\newcommand{\CC}{{\mathbb{C}}}\newcommand{\RR}{{\mathbb{R}}}\newcommand{\ra}{\rightarrow}\newcommand{\ds}{\displaystyle}$The implication is not true without additional conditions on the functions. It is not even true for $f,h$ independent of $u$. So the problem is not (only) compactness in the space of bounded functions.
Let me first recall a counterexample from stability theory that impressed me very much as a student.

Consider $r : \RR \times [0,1] \ra \RR$ given by $$r(t,a) = \frac{1+2a t^4}{1+t^2+a^3 t^6} \ \ \mbox{ for }t \in\RR,\ a \in [0,1].$$ Then $r$ is a positive function of class ${\mathcal C}^\infty$ and $\ds\lim_{t \ra +\infty} r(t,a) = 0$ for every $a \in [0,1]$. The convergence is not uniform with respect to $a \in [0,1]$, however: indeed $r(a^{-3/4},a)=\frac{1+2a^{-2}}{1+2a^{-3/2}}\sim a^{-1/2}$ as $a\ra 0$, so $\ds\lim_{a \ra 0} a^{1/2} r(a^{-3/4},a) = 1.$ Now define $s : \RR^3 \ra \RR^2$ by $$s(t,x) = \left(\frac{1}{r} \frac{\partial r}{\partial t}\right) \left(t,\frac{x_2^2}{x_1^2 + x_2^2}\right) \cdot x \ \ \mbox{ if } \ \ x = (x_1, x_2) \neq 0,$$ and $s(t,(0,0))=0$. Then $s$ is continuous and satisfies a local Lipschitz condition with respect to $x$ (near $x=(0,0)$ this is a bit tricky to verify, but it is not needed for the uniqueness of the solutions of the initial value problems anyway). The initial value problem $$z' = s(t,z),\ \ z(t_0) = b\mbox{ with }t_0\in\RR,\ b \in \RR^2$$
has a unique solution, and for $b= (b_1,b_2)\neq 0$ this solution is given by $$z(t) = \frac{r(t,a_0)}{r(t_0, a_0)}\, b \ \ \mbox{ with } \ \ a_0 = \frac{b_2^2}{b_1^2 + b_2^2}$$ (indeed $z(t)$ stays on the ray through $b$, so the second argument of $s$ is constantly $a_0$). Therefore all solutions of the differential equation tend to $0$ as $t\to+\infty$; but given $\varepsilon,M>0$ and $t_0\in\RR$, there exists $b\in\RR^2$ with $|b|<\varepsilon$ such that the solution of $z' = s(t,z)$, $z(t_0) = b$ satisfies $||z||_\infty>M$. It suffices to choose first $a_0>0$ small enough that $r(a_0^{-3/4},a_0)>2M\,r(t_0,a_0)/\varepsilon$, then $B$ with $|B|=1$ such that $a_0=\frac{B_2^2}{B_1^2 + B_2^2}$, and finally $b=\frac\varepsilon2 B$; then $|z(a_0^{-3/4})| = \frac{r(a_0^{-3/4},a_0)}{r(t_0,a_0)}\,\frac\varepsilon2 > M$.
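For readers who want to see this numerically, here is a small Python sketch (purely illustrative, standard library only) that evaluates $r$ along the two limits above and runs the $\varepsilon$-$M$ recipe with the hypothetical values $\varepsilon=0.01$, $M=100$, $t_0=0$:

```python
import math

def r(t, a):
    """r(t, a) = (1 + 2*a*t^4) / (1 + t^2 + a^3*t^6)."""
    return (1 + 2 * a * t**4) / (1 + t**2 + a**3 * t**6)

# Pointwise decay: for each fixed a, r(t, a) -> 0 as t -> +infinity ...
print([r(1e6, a) for a in (0.0, 0.5, 1.0)])        # all tiny

# ... but not uniformly: along t = a^(-3/4), sqrt(a)*r(t, a) -> 1 as a -> 0.
print([math.sqrt(a) * r(a**-0.75, a) for a in (1e-2, 1e-4, 1e-6)])

# The eps-M recipe with (hypothetical) eps = 0.01, M = 100, t0 = 0:
eps, M, t0 = 0.01, 100.0, 0.0
a0 = 1.0
while r(a0**-0.75, a0) <= 2 * M * r(t0, a0) / eps:
    a0 /= 10                                       # shrink a0 until the bound holds
# |z(t)| = r(t, a0) / r(t0, a0) * |b| for the explicit solution, with |b| = eps/2:
znorm = lambda t: r(t, a0) / r(t0, a0) * eps / 2
print(znorm(a0**-0.75) > M, znorm(1e15))           # sup norm exceeds M, yet z(t) -> 0
```

Nothing here replaces the argument above; it only confirms the inequalities for one concrete choice of parameters.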

Now we adapt this example to the given $x,y$-system; more precisely, to one independent of $u$. With the function $s$ defined above, we essentially put $t=x_3$: $$f:\RR^3\to\RR^3,\ f(x_1,x_2,x_3)=(s(x_3,(x_1,x_2)),1)\mbox{ and }h:\RR^3\to\RR,\ h(x_1,x_2,x_3)=x_1^2+x_2^2.$$ The solution of $\dot x=f(x)$, $x(0)=x_0=(x_{01},x_{02},x_{03})$ then has $x_3(t)=x_{03}+t$, and $z(u)=(x_1(u-x_{03}),x_2(u-x_{03}))$ satisfies $$\frac{dz}{du}=s(u,z),\ \ z(x_{03})=(x_{01},x_{02}).$$ As seen above, for every solution $x(t)$ the output $y(t)=x_1^2(t)+x_2^2(t)=||z(t+x_{03})||_2^2$ is bounded and tends to $0$; but given $\varepsilon,M>0$ and $x_{03}\in\RR$, there exists $(x_{01},x_{02})\in\RR^2$ of norm smaller than $\varepsilon$ such that the solution of $\dot x=f(x)$, $x(0)=x_0$, $y=h(x)$ satisfies $||y||_\infty>M$. Therefore a function $g$ as desired in the question cannot exist.
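As a sanity check on this reduction (again a hypothetical numerical sketch, not part of the argument): writing $r=N/D$, the factor $(1/r)\,\partial r/\partial t$ equals $N'/N - D'/D$, which gives an explicit formula for $s$ and hence for $f$. Integrating $\dot x=f(x)$ with a classical Runge-Kutta step and comparing the output $y$ with the closed-form solution from above (the initial data $(0.3, 0.4, 0)$ and the step size are arbitrary choices):

```python
import math

def r(t, a):
    return (1 + 2 * a * t**4) / (1 + t**2 + a**3 * t**6)

def s(t, x1, x2):
    """s(t,x) = ((1/r) dr/dt)(t, x2^2/(x1^2+x2^2)) * x, with s(t,(0,0)) = 0."""
    if x1 == 0.0 and x2 == 0.0:
        return 0.0, 0.0
    a = x2**2 / (x1**2 + x2**2)
    N, Np = 1 + 2*a*t**4, 8*a*t**3                  # numerator of r and its t-derivative
    D, Dp = 1 + t**2 + a**3*t**6, 2*t + 6*a**3*t**5  # denominator and its t-derivative
    c = Np / N - Dp / D                             # (1/r) dr/dt at (t, a)
    return c * x1, c * x2

def f(x1, x2, x3):
    """The autonomous 3D field f(x) = (s(x3, (x1, x2)), 1)."""
    s1, s2 = s(x3, x1, x2)
    return s1, s2, 1.0

# Integrate xdot = f(x) from x0 = (0.3, 0.4, 0) with classical RK4 up to t = 1.
x1, x2, x3 = 0.3, 0.4, 0.0
h = 1e-3
for _ in range(1000):
    k1 = f(x1, x2, x3)
    k2 = f(x1 + h/2*k1[0], x2 + h/2*k1[1], x3 + h/2*k1[2])
    k3 = f(x1 + h/2*k2[0], x2 + h/2*k2[1], x3 + h/2*k2[2])
    k4 = f(x1 + h*k3[0], x2 + h*k3[1], x3 + h*k3[2])
    x1 += h/6 * (k1[0] + 2*k2[0] + 2*k3[0] + k4[0])
    x2 += h/6 * (k1[1] + 2*k2[1] + 2*k3[1] + k4[1])
    x3 += h/6 * (k1[2] + 2*k2[2] + 2*k3[2] + k4[2])

# Compare y = x1^2 + x2^2 with the closed form y(t) = (r(t, a0)/r(0, a0))^2 * |b|^2:
a0 = 0.4**2 / (0.3**2 + 0.4**2)                     # conserved along the solution
y_exact = (r(1.0, a0) / r(0.0, a0))**2 * (0.3**2 + 0.4**2)
print(abs(x1**2 + x2**2 - y_exact))                 # small RK4 discretization error
```

The integrated output agrees with the closed-form expression, as the change of variables $u = t + x_{03}$ predicts.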
Of course, the function $f$ in this example is not very smooth, but it seems to me that smoother examples could also be constructed.