Prove or disprove: if $y'=y-y^2$ and $y(0)=a$ where $0<a<1$ then $a<y(x)\leq 1$ for any $x>0$

I thought about using Picard–Lindelöf and Grönwall's inequality, but this led me nowhere.


Solution 1:

This is probably to be solved with the principles of one-dimensional dynamics. In $y'=f(y)$, with $f$ locally Lipschitz or even differentiable:

  • The roots of $f$ are exactly the stationary points, that is, they give rise to the constant solutions.
  • By uniqueness, no other solution can touch or cross a constant solution. Consequently, a solution starting between two roots of $f$ remains between those roots.
  • All non-constant solutions are strictly monotone. An increasing solution that is bounded above by a root exists for the whole positive time axis and converges to that root; decreasing solutions, and the behavior in the direction of decreasing time, satisfy analogous claims.

Now observe that $f(y)=y-y^2=y(1-y)$ has roots $0$ and $1$ and apply these principles to directly get the claim without computing the solutions.
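As a quick numerical sanity check of these qualitative claims (a sketch, not part of the proof; the stepper `rk4`, the step size `h`, and the horizon `T` are my own choices), one can integrate the ODE with a classical Runge–Kutta scheme and confirm that a solution starting at $a \in (0,1)$ stays in $(a, 1)$, increases strictly, and approaches the root $1$:

```python
import numpy as np

def f(y):
    # right-hand side of the autonomous ODE: y' = y - y^2 = y*(1 - y)
    return y * (1.0 - y)

def rk4(y0, h=1e-3, T=20.0):
    # classical 4th-order Runge-Kutta for the scalar autonomous ODE y' = f(y)
    y = y0
    ys = [y]
    for _ in range(int(T / h)):
        k1 = f(y)
        k2 = f(y + 0.5 * h * k1)
        k3 = f(y + 0.5 * h * k2)
        k4 = f(y + h * k3)
        y += (h / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)
        ys.append(y)
    return np.array(ys)

# start between the equilibria 0 and 1
a = 0.2
ys = rk4(a)
assert np.all(np.diff(ys) > 0)                  # strictly increasing
assert np.all(ys[1:] > a) and np.all(ys <= 1)   # trapped in (a, 1]
print(ys[-1])                                   # close to the root 1
```

Of course this only illustrates the behavior for one value of $a$; the one-dimensional-dynamics argument above is what proves it for all $a \in (0,1)$.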

Solution 2:

Separating variables (valid while $0 < y < 1$, which holds near $x = 0$) and using partial fractions:

$$y' = y - y^2 \iff \dfrac{dy}{y(1-y)} = dx \iff \left(\dfrac{1}{y} + \dfrac{1}{1-y}\right)dy = dx \iff \ln\dfrac{y}{1-y} = x + C \iff y = \dfrac{e^x}{c + e^x}$$

Hence we have that $y(0) = a = \dfrac{1}{1+c} \iff c = \dfrac{1-a}{a} > 0$, strictly, since $0 < a < 1$.

Since $c > 0$, the function $y(x) = \dfrac{e^x}{c + e^x}$ is strictly increasing with $y(0) = a$ and $y(x) < 1$ for all $x$, so $a < y(x) \le 1$ for every $x > 0$.
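For completeness, here is a small check (again just a sketch; the sample points and the finite-difference step `h` are arbitrary choices of mine) that the explicit formula $y(x) = e^x/(c + e^x)$ with $c = (1-a)/a$ really satisfies the ODE, the initial condition, and the claimed bounds:

```python
import math

a = 0.3
c = (1 - a) / a          # c > 0 because 0 < a < 1

def y(x):
    # explicit solution obtained by separation of variables
    return math.exp(x) / (c + math.exp(x))

# y(0) recovers the initial value a
assert abs(y(0) - a) < 1e-12

# central-difference check that y' = y - y^2
h = 1e-6
for x in (0.5, 1.0, 3.0):
    lhs = (y(x + h) - y(x - h)) / (2 * h)
    rhs = y(x) * (1 - y(x))
    assert abs(lhs - rhs) < 1e-6

# a < y(x) < 1 at some sample points x > 0
assert all(a < y(x) < 1 for x in (0.01, 1, 10, 30))
```

The strict inequality $y(x) < 1$ for finite $x$ is visible here: the solution only reaches $1$ in the limit $x \to \infty$.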