Is the Lipschitz condition necessary for the existence of a unique solution of an I.V.P.?

Is the Lipschitz condition a necessary condition or a sufficient condition for the existence of a unique solution of an Initial Value Problem?


I saw in a book that it is a sufficient condition. But I want an example proving that it is only sufficient, not necessary. That is, I want an example of an I.V.P. of the form $$\frac{dy}{dx}=f(x,y)\text{ , with initial condition } y(x_0)=y_0$$ in which $f(x,y)$ does not satisfy the Lipschitz condition although the I.V.P. has a unique solution.

Also, I saw on Wikipedia that the I.V.P. $\frac{dy}{dx}=y^{1/3}$, with initial condition $y(0)=0$, has three solutions. But how do we get three solutions?

When I solve the equation with the initial condition, I get $y=\left(\frac{2}{3}x\right)^{3/2}$.
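Substituting back, each of $$y_1(x)=0,\qquad y_2(x)=\left(\tfrac{2}{3}x\right)^{3/2},\qquad y_3(x)=-\left(\tfrac{2}{3}x\right)^{3/2},\qquad x\ge 0,$$ satisfies $y'=y^{1/3}$ and $y(0)=0$ (for $y_3$, note that $y_3'=-\left(\tfrac{2}{3}x\right)^{1/2}=y_3^{1/3}$), so perhaps these are the three solutions Wikipedia counts?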

According to uranix's comment, when an I.V.P. has a non-unique solution, the solutions can be put in the form given by uranix. So I think a non-unique solution implies infinitely many solutions. Then where does the question of the existence of $2$ or $3$ or $4$ solutions arise from?

I asked about the problem $\frac{dy}{dx}=3y^{2/3}$ with $y(0)=0$ here, and the answer to that question says that there are infinitely many solutions.
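(If I recall correctly, the family given there is $$y_c(x)=\begin{cases}0, & x\le c,\\ (x-c)^3, & x> c,\end{cases}\qquad c\ge 0,$$ one solution for each $c\ge 0$: indeed $y_c'(x)=3(x-c)^2=3\,y_c(x)^{2/3}$ for $x>c$, and both sides vanish for $x\le c$.)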


Now, the following three questions arise in my mind:

$(1)$ An example of an I.V.P. in which $f(x,y)$ does not satisfy the Lipschitz condition but the I.V.P. has a unique solution.

$(2)$ If an I.V.P. has a non-unique solution, can we say that the I.V.P. has infinitely many solutions?

$(3)$ If the answer to $(2)$ is negative, how many solutions exist and how do we find them?

Can anyone help me understand these properly?


Solution 1:

Answer to Question 1. The Lipschitz condition is sufficient but not necessary.

Fact I. The Lipschitz condition is sufficient for uniqueness. This is a consequence of the Picard–Lindelöf Theorem.

In particular, if $\boldsymbol f:D\to\mathbb R^n$ is continuous, where $D\subset \mathbb R^{n+1}$ is open and $\boldsymbol f=\boldsymbol f(t,\boldsymbol{x})$ with $t\in \mathbb R$ and $\boldsymbol{x}\in \mathbb R^n$, and $\boldsymbol f$ is locally Lipschitz with respect to $\boldsymbol{x}$, i.e., for every compact $K\subset D$ there exists an $L_K>0$ such that, for every $(t,\boldsymbol{x}_1),(t,\boldsymbol{x}_2)\in K$, $$ \lvert\,\boldsymbol{f}(t,\boldsymbol{x}_1)-\boldsymbol{f}(t,\boldsymbol{x}_2)\rvert \le L_K\lvert\boldsymbol{x}_1-\boldsymbol{x}_2\rvert, $$ then the IVP $$\boldsymbol{x}'=\boldsymbol{f}(t,\boldsymbol{x}), \quad\boldsymbol{x}(\tau)=\boldsymbol{\xi},$$ possesses a unique solution for every $(\tau,\boldsymbol{\xi})\in D$.
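For concreteness, here is a minimal numerical sketch of the Picard iteration that drives this theorem; the test problem $x'=x$, $x(0)=1$, the grid, and the iteration count are illustrative choices, not part of the theorem:

```python
import numpy as np

# Picard iteration for the scalar IVP x' = f(t, x), x(0) = 1,
# here with f(t, x) = x, whose exact solution is exp(t).
f = lambda t, x: x

t = np.linspace(0.0, 1.0, 1001)   # grid on [0, 1]
x = np.ones_like(t)               # x_0(t) = xi = 1, the constant initial guess

for _ in range(20):
    g = f(t, x)
    # x_{n+1}(t) = xi + int_0^t f(s, x_n(s)) ds, via the trapezoidal rule
    x = 1.0 + np.concatenate(
        ([0.0], np.cumsum(0.5 * (g[1:] + g[:-1]) * np.diff(t))))

print(np.max(np.abs(x - np.exp(t))))  # small: the iterates converge to exp(t)
```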

Fact II. Lipschitz condition is not necessary for uniqueness.

Take for example the IVP $$ x'=f(x), \quad x(\tau)=\xi,\tag{1} $$ where $f:\mathbb R\to\mathbb R$ is merely continuous and positive, i.e., $f(x)>0$ for all $x\in\mathbb R$. Then $(1)$ enjoys uniqueness for all $(\tau,\xi)\in\mathbb R^2$. To see this define $$ F(x)=\tau+\int_\xi^x\frac{ds}{f(s)}. $$ Then $F:\mathbb R\to (A_-,A_+)$ is one-to-one and onto, where $A_\pm=\lim_{x\to\pm\infty}F(x)$. Also, $F$ is continuously differentiable and strictly increasing, as $F'(x)=1/f(x)>0$, and hence $F$ possesses a continuously differentiable inverse $\varphi : (A_-,A_+)\to\mathbb R$. Clearly, $\varphi(\tau)=F^{-1}(\tau)=\xi$ and $$ \varphi'(t)=\frac{1}{F'\big(F^{-1}(t)\big)}=f\big(F^{-1}(t)\big)=f\big(\varphi(t)\big). $$ Hence $\varphi$ is a solution of $(1)$.

Let $\psi: I\to\mathbb R$ be another solution of $(1)$, where $I$ is an open interval containing $\tau$. Then $$ 1=\frac{\psi'(t)}{f\big(\psi(t)\big)}=\Big(F\big(\psi(t)\big)\Big)^{\!\prime}, \quad \text{for all $t\in I$}. $$ Thus $$ t+c=F\big(\psi(t)\big), $$ and for $t=\tau$, $$ \tau+c=F\big(\psi(\tau)\big)=F(\xi)=\tau. $$ Hence $c=0$, and thus $$ t=F\big(\psi(t)\big)=F\big(\varphi(t)\big); $$ since $F$ is one-to-one, $\varphi\equiv\psi$ on $I$.
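For a concrete instance (this particular $f$ is an illustrative choice, not the only one): take $f(x)=1+\sqrt{\lvert x\rvert}$, which is continuous and positive on $\mathbb R$ but not Lipschitz near $0$, since $$ \frac{\lvert f(x)-f(0)\rvert}{\lvert x-0\rvert}=\frac{\sqrt{\lvert x\rvert}}{\lvert x\rvert}=\frac{1}{\sqrt{\lvert x\rvert}}\longrightarrow\infty \quad\text{as } x\to 0. $$ By the argument above, the IVP $x'=1+\sqrt{\lvert x\rvert}$, $x(\tau)=\xi$ nevertheless has a unique solution for every $(\tau,\xi)\in\mathbb R^2$.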

So, uniqueness is obtainable even without assuming the Lipschitz condition!

Answer to Question 2. If uniqueness is violated, then there are infinitely many solutions; in fact, there is a continuum of solutions. This is a consequence of Helmut Kneser's Theorem, which can be found in Hartman's ODEs book, page 15.

A simple proof of the 1-dimensional case can be found here.

Solution 2:

Take

$$F(x,y) = \begin{cases} -y\log y &\text{for $y\in (0,1)$}\\ 0&\text{for $y=0$} \end{cases}$$

in any rectangle where it makes sense.

Then for $c\in (0,1)$ the problem $$\begin{cases} y' = F(x,y)\\ y(c) = 0\end{cases}$$ has exactly one solution. But $F$ is not Lipschitz.
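(A one-line check that the Lipschitz condition fails at $y=0$: $$ \frac{F(x,y)-F(x,0)}{y-0}=-\log y\longrightarrow+\infty \quad\text{as } y\to 0^{+}, $$ so no Lipschitz constant works on any rectangle touching the line $y=0$. One standard route to the uniqueness claim is Osgood's criterion, since the modulus $\omega(u)=u\log(1/u)$ satisfies $\int_0^{1/2}\frac{du}{\omega(u)}=\infty$.)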

The answer to your other two questions is: it depends on the IVP.

That means you can find problems with an infinite number of solutions, and there are problems with a finite number of solutions, such as

$$\begin{cases} (f'(x))^2 = x\\ f(0) = 0 \end{cases}$$ say for $x\in [0,1]$.
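(Presumably the intended count here is two, assuming $f'$ is required to be continuous: then $f'(x)=\pm\sqrt{x}$, and since a continuous $f'$ can only change sign where $\sqrt{x}=0$, i.e., at $x=0$, the only solutions on $[0,1]$ are $$ f(x)=\tfrac{2}{3}x^{3/2} \qquad\text{and}\qquad f(x)=-\tfrac{2}{3}x^{3/2}. $$ Note that this implicit equation is not of the form $y'=f(x,y)$, which is why a finite number of solutions does not contradict Kneser's continuum theorem from Solution 1.)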

You can find sufficient conditions of the form "under these circumstances, there are $n$ solutions", but I doubt anything general can be said.

Solution 3:

Answer for $2$: If an IVP has two solutions and $f$ is continuous (not necessarily Lipschitz), then it has infinitely many solutions. I present a proof below, but the rough idea is as follows:

  • Suppose there are at least two solutions. Then there is a point $a\neq x_0$ and two solutions $y_1,y_2$ such that $y_1(a)\neq y_2(a)$.

  • Take any point $(a,b)$ on the segment joining $(a,y_1(a))$ and $(a,y_2(a))$, and consider the IVP with initial condition $y(a)=b$.

  • If this new solution does not intersect the other solutions, then we are done. If it intersects another one, note that at the intersection point we can follow either branch, and the result is still a solution (by the mean value theorem).

  • Thus, for every point on the considered segment, we can find a solution that can be extended all the way back to the initial point, giving a distinct solution of the original IVP.

Details follow.

Problem. Suppose $f:\mathbb{R}^2 \to \mathbb{R}$ is continuous and $t_0, x_0 \in \mathbb{R}$. Prove that if the Cauchy problem $\begin{cases} \dot{x}=f(t,x) \\ x(t_0)=x_0 \end{cases}$ has two distinct solutions, then it has infinitely many solutions.

Proof: Without loss of generality we may assume that $x_0=t_0=0$. Then there exist two solutions $x_1,x_2$ of the Cauchy problem which differ at a point $a$, which we may assume is greater than $0$; say $x_1(a) < x_2(a)$. Denote by $P=(a,h)$ a point on the segment $\{a\} \times (x_1(a),x_2(a))$. By the Cauchy–Peano existence theorem, there exists a solution $x$ around $P$ of the problem $\begin{cases} \dot{x}=f(t,x) \\ x(a)=h. \end{cases}$ We now extend $x$ towards $0$. Denote by $K$ the compact region bounded by the graphs of $x_1,x_2$ and the line $t=a$. Since $x$ stays in that compact set in a left neighborhood of $a$, by the compact extension theorem it can be extended until it reaches the boundary of $K$. From the intersection point of the graph of $x$ with the boundary of $K$ we can continue along the graph of $x_1$ or $x_2$ until we reach $(0,0)$, and by the corollary of the mean value theorem the glued graph is the graph of a solution to the initial differential equation.

Thus, we can find a solution $x_h$ for every $h \in (x_1(a),x_2(a))$, and therefore the initial equation has uncountably many solutions.
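A rough numerical illustration of this construction, using the IVP $y'=3y^{2/3}$, $y(0)=0$ from the question as the concrete example (the point $(a,h)$ and the tolerances below are arbitrary choices):

```python
import numpy as np
from scipy.integrate import solve_ivp

# Backward integration from an interior point (a, h), as in the proof above.
# Example IVP: y' = 3*y**(2/3), y(0) = 0, with extreme solutions
# x1(t) = 0 and x2(t) = t**3 through the origin.
def f(t, y):
    # sign-extended right-hand side so the fractional power stays real
    return 3.0 * np.sign(y) * np.abs(y) ** (2.0 / 3.0)

a, h = 1.0, 0.5                    # any point with x1(a) < h < x2(a) = 1
sol = solve_ivp(f, [a, 0.0], [h],  # integrate backwards from (a, h) to t = 0
                dense_output=True, max_step=1e-3)

c = a - h ** (1.0 / 3.0)           # analytically, y = (t - c)**3 for t >= c
print(sol.sol(c)[0])               # ~ 0: the graph meets x1 = 0 at t = c > 0
print(sol.sol(0.0)[0])             # ~ 0: it then follows x1 back to (0, 0)
```

Integrating backwards from $(a,h)$, the solution descends along $(t-c)^3$, lands on the lower solution $x_1\equiv 0$ at $t=c>0$, and follows it back to $(0,0)$, exactly as in the gluing argument above.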