When does the Gibbs phenomenon arise when solving PDEs?

I have several questions about the Gibbs phenomenon in the context of PDEs. I don't really understand which features of a PDE problem give rise to the Gibbs phenomenon.

I understand that it is a phenomenon that appears at jump discontinuities, but nothing more.

Consider the following PDE, defined on a rectangle of length $L$ and width $H$:

$$c\frac{\partial u}{\partial x} = \alpha\left(\frac{\partial^2 u}{\partial x^2} + \frac{\partial^2 u}{\partial y^2}\right)$$

where $\alpha$ and $c$ are strictly positive,

with the following boundary conditions:

$u(x,0) = 0$

$u(0,y) = 0$

$\frac{\partial u}{\partial y}(x,H) = 0$

$u(L,y) = U_0(2\frac{y}{H} - (\frac{y}{H})^2)$

where $U_0$ is a constant.

From my limited understanding, I believe there is no Gibbs phenomenon in this problem, but I would like a more detailed answer.

More generally, details on which features of a PDE problem indicate whether the Gibbs phenomenon will appear would be greatly appreciated.


The Gibbs phenomenon is a theorem about Fourier series, and it sometimes arises in PDEs that can be solved by Fourier series. It doesn't depend on the PDE itself, but if you impose boundary values that have jump discontinuities and try to solve using a Fourier series, then you should expect the Gibbs theorem to limit how well your partial sums converge.

To understand Gibbs, it is useful to see that there are several ways to define convergence of a series of functions to some limit function.

  1. The partial sums, evaluated at each point of the domain, converge to the limit function value there. (pointwise convergence)

  2. The maximum difference between the partial sums and the function tends to zero. (uniform convergence)

  3. The integral of the square of the difference between the partial sums and the function tends to zero. (convergence in mean, or $L^2$ convergence)

For the Fourier partial sums of a periodic function $f$ we have at least the following:

Item (1) almost holds if $f$ is piecewise differentiable: the caveat is that at a jump of $f$ the series converges to the average of the left and right limits of $f$, rather than to the value of $f$ there.

Item (2) holds if $f$ is continuously differentiable (and, more generally, if $f$ is continuous and piecewise differentiable).

Item (3) holds no matter how rough $f$ is, as long as $\int f^2 dx$ exists and is finite.
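As a quick numerical illustration (my own sketch, assuming Python with numpy; none of it is needed for the argument), here are the three error measures for the Fourier partial sums of the continuous, piecewise differentiable function $f(x) = |x|$ on $[-\pi,\pi]$, whose series is $\frac{\pi}{2} - \frac{4}{\pi}\sum_{k\ \text{odd}} \frac{\cos kx}{k^2}$. The RMS error stands in for the $L^2$ error up to a constant factor. All three errors shrink as terms are added, as items (1)-(3) predict.

```python
import numpy as np

# Fourier partial sums of f(x) = |x| on [-pi, pi]:
#   f(x) = pi/2 - (4/pi) * sum over odd k of cos(k x) / k^2
x = np.linspace(-np.pi, np.pi, 4001)
f = np.abs(x)

def partial_sum(n_terms):
    s = np.full_like(x, np.pi / 2)
    for k in range(1, 2 * n_terms, 2):      # odd harmonics 1, 3, ..., 2*n_terms - 1
        s -= (4.0 / np.pi) * np.cos(k * x) / k**2
    return s

i0 = np.argmin(np.abs(x))                   # grid index closest to x = 0 (the kink)
for n in (5, 20, 80):
    err = partial_sum(n) - f
    print(f"{n:3d} terms: error at x=0 = {abs(err[i0]):.5f}, "
          f"max error = {np.abs(err).max():.5f}, "
          f"RMS error = {np.sqrt((err**2).mean()):.5f}")
```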

Finally, the Gibbs theorem says that item (2) fails when $f$ is piecewise differentiable with a jump discontinuity: near the jump, the partial sums overshoot by roughly nine percent of the jump size, no matter how many terms you take. Adding terms squeezes the overshoot closer to the jump but never shrinks it.
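Here is a minimal numerical sketch of that overshoot (again my own, assuming numpy; the square wave is just a convenient example). The partial sums of $\operatorname{sign}(\sin x)$, which jumps by 2 at $x = 0$, keep overshooting by about $0.18$, i.e. roughly nine percent of the jump, however many terms are used.

```python
import numpy as np

def square_wave_partial_sum(x, n_terms):
    """Fourier partial sum of sign(sin x): (4/pi) * sum over odd k of sin(k x) / k."""
    s = np.zeros_like(x)
    for k in range(1, 2 * n_terms, 2):
        s += (4.0 / np.pi) * np.sin(k * x) / k
    return s

x = np.linspace(1e-4, 0.5, 40000)            # look just to the right of the jump at x = 0
for n in (10, 50, 200, 1000):
    overshoot = square_wave_partial_sum(x, n).max() - 1.0   # limit function equals +1 here
    print(f"{n:5d} terms: overshoot = {overshoot:.4f} "
          f"= {100 * overshoot / 2:.1f}% of the jump (which is 2)")
```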

Applying this information to a PDE: if you can solve your PDE by separation of variables, ending up with a Fourier series, and a boundary value has a jump discontinuity, then the Gibbs phenomenon comes into play. Suppose you want to evaluate your series at a point near that boundary using, say, 50 terms or fewer. How accurate is it? The answer is: you don't know. Even if you know that a solution exists, your partial sum might be in error by roughly ten percent of the jump, or more right next to the discontinuity. In your problem, the boundary value $U_0\bigl(2\tfrac{y}{H} - (\tfrac{y}{H})^2\bigr)$ is continuous, vanishes at $y = 0$ (matching $u(x,0) = 0$ at the corner), and has zero $y$-derivative at $y = H$ (matching $\frac{\partial u}{\partial y}(x,H) = 0$), so no jump is introduced and you should not expect a Gibbs overshoot; a numerical check is sketched below.
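To make that concrete, here is a rough sketch of my own (assuming numpy; the values $H = U_0 = 1$ and the step profile are arbitrary choices for illustration). Separating variables in $y$ with $u(x,0) = 0$ and $\frac{\partial u}{\partial y}(x,H) = 0$ gives eigenfunctions $\sin\bigl((n - \tfrac12)\pi y / H\bigr)$. Expanding the question's boundary profile in them, the maximum error keeps shrinking as terms are added, while for an artificial profile with a jump the overshoot stalls near nine percent of the jump.

```python
import numpy as np

H, U0 = 1.0, 1.0                                   # arbitrary illustrative values
N = 50_000
y = (np.arange(N) + 0.5) * H / N                   # midpoint grid on (0, H)
dy = H / N

g_smooth = U0 * (2 * y / H - (y / H) ** 2)         # the boundary data from the question
g_step   = np.where(y > H / 2, U0, 0.0)            # artificial boundary data with a jump

def expand(g, n_terms):
    """Partial sum of the eigenfunction expansion of g with n_terms terms."""
    s = np.zeros_like(y)
    for n in range(1, n_terms + 1):
        lam = (n - 0.5) * np.pi / H                # eigenvalues from phi(0)=0, phi'(H)=0
        phi = np.sin(lam * y)
        b = (2.0 / H) * np.sum(g * phi) * dy       # expansion coefficient (midpoint rule)
        s += b * phi
    return s

for n in (10, 50, 200):
    err_smooth = np.abs(expand(g_smooth, n) - g_smooth).max()  # uniform error, smooth data
    overshoot  = expand(g_step, n).max() - U0                   # overshoot beyond the jump's top
    print(f"{n:4d} terms: max error (smooth data) = {err_smooth:.6f}, "
          f"overshoot (step data) = {overshoot:.5f}")
```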