I came across a very nice STEP question - Q8, STEP 1, 2018.

It assumed the existence of functions $\mathrm s(x)$ and $\mathrm c(x)$ with the properties that $\mathrm s(0)=0$, $\mathrm c(0)=1$, $\mathrm s'(x) = \mathrm c(x)^2$ and $\mathrm c'(x)=-\mathrm s(x)^2$.

This leads immediately to analogues of familiar formulae, such as $\mathrm s(x)^3+\mathrm c(x)^3 \equiv 1$.
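For instance, differentiating and using the two given derivatives,
$$\frac{\mathrm d}{\mathrm dx}\left(\mathrm s(x)^3+\mathrm c(x)^3\right) = 3\,\mathrm s(x)^2\,\mathrm c(x)^2 - 3\,\mathrm c(x)^2\,\mathrm s(x)^2 = 0,$$
so $\mathrm s(x)^3+\mathrm c(x)^3$ is constant, and the values at $x=0$ fix the constant at $1$.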

These hypothetical functions can then be used, for example, to show that $$ \int \frac{\mathrm du}{(1-u^3)^{4/3}} = \frac{u}{(1-u^3)^{1/3}}+K$$
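One way to see this (a quick sketch): substituting $u = \mathrm s(x)$, so that $\mathrm du = \mathrm c(x)^2\,\mathrm dx$ and $1-u^3 = \mathrm c(x)^3$, the integral becomes
$$\int \frac{\mathrm c(x)^2\,\mathrm dx}{\mathrm c(x)^4} = \int \frac{\mathrm dx}{\mathrm c(x)^2} = \frac{\mathrm s(x)}{\mathrm c(x)}+K = \frac{u}{(1-u^3)^{1/3}}+K,$$
where the middle step uses $\left(\dfrac{\mathrm s}{\mathrm c}\right)' = \dfrac{\mathrm s'\mathrm c-\mathrm s\,\mathrm c'}{\mathrm c^2} = \dfrac{\mathrm c^3+\mathrm s^3}{\mathrm c^2} = \dfrac{1}{\mathrm c^2}$.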

These hypothetical functions can also be used, for example, to show that $$\int (1-u^3)^{1/3}~\mathrm du = \frac{1}{2}\mathrm s^{-1}(u)+\frac{1}{2}u(1-u^3)^{1/3} + K$$
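Similarly, with $u = \mathrm s(x)$ the integrand $(1-u^3)^{1/3}\,\mathrm du$ becomes $\mathrm c(x)\cdot\mathrm c(x)^2\,\mathrm dx = \mathrm c(x)^3\,\mathrm dx$, and since
$$\big(\mathrm s(x)\,\mathrm c(x)\big)' = \mathrm c(x)^3-\mathrm s(x)^3 = 2\,\mathrm c(x)^3-1,$$
we get $\int \mathrm c(x)^3\,\mathrm dx = \tfrac12\big(x+\mathrm s(x)\,\mathrm c(x)\big)+K$, which is the stated result once $x = \mathrm s^{-1}(u)$ and $\mathrm c(x) = (1-u^3)^{1/3}$ are substituted back.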

I have a few questions:

  • Are there names for these functions $\mathrm s$ and $\mathrm c$?

  • How do we prove the existence/continuity/differentiability of such functions?

  • Are there functions with $\mathrm s'(x) = \mathrm c(x)^{n-1}$, $\mathrm c'(x) = -\mathrm s(x)^{n-1}$?

  • Do they have names?

  • How effective can it be to posit the existence of hypothetical functions which help to simplify integration?


Solution 1:

For any $1 \le n \in \Bbb N$ there exists a pair of real-valued functions $s_n(x)$, $c_n(x)$ on an open interval $I \subset \Bbb R$, with $0 \in I$, which satisfy

$s_n'(x) = c_n^{n - 1}(x), \tag 1$

$c_n'(x) = -s_n^{n - 1}(x), \tag 2$

$s_n(0) = 0, \; c_n(0) = 1; \tag 3$

before demonstrating the existence of such a pair for any $n \in \Bbb N$, we observe that (1) and (2) imply

$(s_n^n(x) + c_n^n(x))' = n s_n^{n - 1}(x)s_n'(x) + n c_n^{n - 1}(x)c_n'(x)$ $= n(s_n^{n - 1}(x)c_n^{n - 1}(x) - c_n^{n - 1}(x) s_n^{n - 1}(x)) = 0, \tag 4$

which together with (3) shows that $s_n^n(x) + c_n^n(x) = s_n^n(0) + c_n^n(0) = 1, \; x \in I, \tag 5$

which generalizes the well-known formula holding for the ordinary trigonometric functions $\sin$ and $\cos$,

$\sin^2 x + \cos^2 x = 1. \tag 6$

To see that such functions $s_n(x)$, $c_n(x)$ do indeed exist, we consider the planar vector fields $\mathbf G_n(u, v)$ defined by

$\mathbf G_n(u, v) = \begin{pmatrix} v^{n - 1} \\ -u^{n - 1} \end{pmatrix}; \tag 7$

setting

$\mathbf r = \begin{pmatrix} u \\ v \end{pmatrix}, \tag 8$

for each $n \in \Bbb N$ we have the autonomous ordinary differential equation

$\dot{\mathbf r} = \begin{pmatrix} \dot u \\ \dot v \end{pmatrix} = \mathbf G_n(u, v) = \begin{pmatrix} v^{n - 1} \\ -u^{n - 1} \end{pmatrix}; \tag 9$

the vector function $\mathbf G_n(u, v)$ occurring in (9) is in fact infinitely continuously differentiable, indeed analytic, and thus everywhere locally Lipschitz; therefore, by the Picard-Lindelöf theorem, there is a unique local solution through any point $(u_0, v_0)$. These solutions are themselves differentiable, indeed analytic, since these properties are inherited by solutions from the defining vector field $\mathbf G_n(u, v)$.

We momentarily restrict our attention to that unique solution $(u_n(x), v_n(x))$ of

$\dot {\mathbf r} = \mathbf G_n(u, v) \tag{10}$

which is initialized at $x = 0$ to

$u_n(0) = 0, \; v_n(0) = 1; \tag{11}$

we see by comparison of (1)-(3) with (9), (11) that uniqueness of solutions implies

$s_n(x) = u_n(x), \; c_n(x) = v_n(x), \tag{12}$

so it appears that the functions $s_n(x)$, $c_n(x)$ are the components of the integral curve of (10) which is initialized at

$\mathbf r(0) = \begin{pmatrix} 0 \\ 1 \end{pmatrix}. \tag{13}$

The preceding discussion covers the existence, uniqueness, and differentiability properties of the functions $s_n(x)$, $c_n(x)$ for $n \ge 1$.
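As a quick numerical illustration of all this (not part of the argument, and with the integration range and tolerances chosen arbitrarily), one can integrate (9) from the initial data (13) and check the conservation law (5) along the computed trajectory; here is a minimal sketch using scipy:

```python
import numpy as np
from scipy.integrate import solve_ivp

def make_field(n):
    """Right-hand side of (9): (u, v) -> (v^(n-1), -u^(n-1))."""
    def field(x, r):
        u, v = r
        return [v**(n - 1), -u**(n - 1)]
    return field

n = 3  # the case appearing in the STEP question; any n >= 1 works
sol = solve_ivp(make_field(n), (0.0, 1.0), [0.0, 1.0],  # initial data (13)
                dense_output=True, rtol=1e-10, atol=1e-12)

x = np.linspace(0.0, 1.0, 5)
u, v = sol.sol(x)        # u approximates s_n(x), v approximates c_n(x)
print(u**n + v**n)       # should stay very close to 1, as in (5)
```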

I'm not really sure how these functions are named in the larger world of mathematics. The best thing I have been able to come up with is Dixon's Elliptic Functions; however, this term may only technically refer to the case $n = 3$; but we've got to call these functions something-or-other, so perhaps extending the terminology to all the $s_n(x)$, $c_n(x)$ is apropos.

Nor can I say much about the utility of the $s_n(x)$, $c_n(x)$ in addressing the practical evaluation of integrals, something at which I am not in any event overly skilled; so I'll leave this subject to those with greater expertise than myself.

The functions $s_n(x)$, $c_n(x)$ in fact have many engaging properties other than those mentioned in the text of the question. For example, $\mathbf G_n(u, v)$ is in fact a Hamiltonian vector field, as one might suspect based upon our affirmation (4)-(5) that $(s_n(x))^n + (c_n(x))^n$ is conserved along trajectories; indeed, setting

$H_n(u, v) = \dfrac{1}{n}(u^n + v^n), \tag{14}$

we have

$\dfrac{\partial H_n}{\partial u} = u^{n - 1}, \tag{15}$

$\dfrac{\partial H_n}{\partial v} = v^{n - 1}; \tag{16}$

then

$\mathbf G_n = \begin{pmatrix} \dfrac{\partial H_n}{\partial v} \\ -\dfrac{\partial H_n}{\partial u} \end{pmatrix}, \tag{17}$

which expresses the vector field $\mathbf G_n(u, v)$ in Hamiltonian form. It is now easy to see that the trajectories of the system, being the level sets of $H_n(u, v)$, are compact for even $n$; the system is a sort of "generalized harmonic oscillator" in this sense, having orbits which are simple closed curves symmetrically surrounding the origin, which is a critical point. When $n$ is odd, the level sets of $H_n(u, v)$ are not compact but are instead unbounded; the curious reader may sketch phase portraits for themselves to gain further insight into the flow of $\mathbf G_n$ (a quick matplotlib sketch is included below); we note that $(0, 0)$ is a critical point unless $n = 1$, in which case the level sets of the Hamiltonian are straight lines of the form

$H_1 = u + v, \tag{18}$

and

$\mathbf G_1 = \begin{pmatrix} 1 \\ -1 \end{pmatrix} \tag{19}$

is a constant vector field; we have $s_1(x) + c_1(x) = \text{constant}$, and there are no critical points.
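Should the curious reader want a head start on those phase portraits, here is a minimal matplotlib sketch (purely illustrative; the plotting window and level count are arbitrary choices of mine) of the level sets of $H_n$ together with the direction of $\mathbf G_n$:

```python
import numpy as np
import matplotlib.pyplot as plt

def plot_phase_portrait(n, ax):
    """Level sets of H_n(u, v) = (u^n + v^n)/n and the field G_n = (v^(n-1), -u^(n-1))."""
    u, v = np.meshgrid(np.linspace(-1.5, 1.5, 201), np.linspace(-1.5, 1.5, 201))
    H = (u**n + v**n) / n
    ax.contour(u, v, H, levels=15)   # trajectories lie on level sets of H_n
    step = 20                        # thin out the grid for the arrows
    ax.quiver(u[::step, ::step], v[::step, ::step],
              v[::step, ::step]**(n - 1), -u[::step, ::step]**(n - 1))
    ax.set_title(f"n = {n}")
    ax.set_aspect("equal")

# even n: nested closed orbits around (0, 0); odd n >= 3: unbounded level sets
fig, axes = plt.subplots(1, 3, figsize=(12, 4))
for n, ax in zip((2, 3, 4), axes):
    plot_phase_portrait(n, ax)
plt.show()
```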

Well, having presented a fairly thorough invitation to further explore these generalizations of $\sin x$ and $\cos x$, I'm going to call it a night.

Solution 2:

These are called generalized trigonometric functions, and they include the functions you ask about. In this specific case, one can manufacture $s$ and $c$ on an interval and then extend them to all of $\Bbb R$ using symmetry and periodicity. Namely, put
$$S(x)=\int_0^x (1-u^n)^{-(n-1)/n}\,\mathrm du,$$
and notice that $S, S'> 0$ for $x\in(0,1)$, hence $S$ has a differentiable, positive inverse $s(x)$ which, by the FTC and the Chain Rule, satisfies
$$s'(x)=(1-s(x)^n)^{(n-1)/n}.$$
If we define $c(x)=s'(x)^{1/(n-1)}$, then we get $s'(x)=c(x)^{n-1}$ and $s(x)^n+c(x)^n=1$ for free, as well as
$$c'(x)=-\tfrac 1n(1-s(x)^n)^{1/n-1}\,n\,s(x)^{n-1}\,s'(x)=-s(x)^{n-1}.$$
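If a sanity check is wanted, here is a minimal numerical sketch of this construction (purely illustrative; the quadrature, root-finding and finite-difference choices are mine):

```python
import numpy as np
from scipy.integrate import quad
from scipy.optimize import brentq

n = 3  # the case in the STEP question; any integer n >= 2 works here

def S(x):
    """S(x) = integral_0^x (1 - u^n)^(-(n-1)/n) du, for 0 <= x < 1."""
    val, _ = quad(lambda u: (1.0 - u**n) ** (-(n - 1) / n), 0.0, x)
    return val

def s(y):
    """Inverse of S: s(y) solves S(x) = y, found by root-finding on (0, 1)."""
    return brentq(lambda x: S(x) - y, 0.0, 0.999)  # valid for 0 < y < S(0.999)

def c(y):
    """c(y) = (1 - s(y)^n)^(1/n), so s^n + c^n = 1 holds by construction."""
    return (1.0 - s(y) ** n) ** (1.0 / n)

# Check s'(y) = c(y)^(n-1) at a sample point via a central difference.
y, h = 0.5, 1e-4
print((s(y + h) - s(y - h)) / (2 * h), c(y) ** (n - 1))  # should agree to several decimals
```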

As to how effective they can be in integration, I am not sure, as there may be useful cases I am not aware of. However, one quick example is to replace the $3$ in the STEP question with $n$ and the $4$ with $n+1$, and then apply the same method used to solve the original problem (which I suspect was intended as a generalization of the $\tan'(x)$ formula).
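For what it's worth, the identity this produces can be checked directly in the notation of this answer: with $u = s(x)$ we have $\mathrm du = c(x)^{n-1}\,\mathrm dx$ and $1-u^n = c(x)^n$, so
$$\int \frac{\mathrm du}{(1-u^n)^{(n+1)/n}} = \int \frac{\mathrm dx}{c(x)^2} = \frac{s(x)}{c(x)}+K = \frac{u}{(1-u^n)^{1/n}}+K,$$
using $\left(\dfrac{s}{c}\right)' = \dfrac{s'c-sc'}{c^2} = \dfrac{c^n+s^n}{c^2} = \dfrac{1}{c^2}$.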