There are many Runge–Kutta methods. The one you have described is (probably) the most popular and widely used one. I am not going to show you how to derive this particular method – instead I will derive the general formula for the explicit second-order Runge–Kutta methods and you can generalise the ideas.

In what follows we will need the following Taylor series expansions.

\begin{align} \tag1\label{eq1} & x(t+\Delta t) = x(t) + (\Delta t)x'(t) + \frac{(\Delta t)^{2}}{2!}x''(t) + \text{higher order terms}. \\ \tag2\label{eq2} & f(t + \Delta t, x + \Delta x) = f(t,x) + (\Delta t)f_{t}(t,x) + (\Delta x)f_{x}(t,x) + \text{higher order terms}. \end{align}

We are interested in the following ODE:

$$x'(t) = f(t,x(t)).$$

The value $x(t)$ is known and $x(t+h)$ is desired. We can solve this ODE by integrating: $$x(t+h) = x(t) + \int_{t}^{t+h}f(\tau,x(\tau))\,\text{d}\tau.$$ Unfortunately, actually doing that integration exactly is often quite hard (or impossible), so we approximate it using quadrature: $$x(t+h) \approx x(t) + h\sum_{i=1}^{N}\omega_{i}f(t + \nu_{i}h,x(t + \nu_{i}h)).$$ The accuracy of the quadrature depends on the number of terms in the summation (the number of stages of the Runge–Kutta method), the weights, $\omega_{i}$, and the positions of the nodes, $\nu_{i}$.
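For example, the simplest choice $N = 1$, $\nu_{1} = 0$, $\omega_{1} = 1$ uses only the known value $x(t)$ and gives the forward Euler method, $$x(t+h) \approx x(t) + hf(t,x(t)),$$ which is only first-order accurate; to do better we need nodes with $\nu_{i} \neq 0$.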

Even this quadrature can be quite difficult to calculate, since on the right-hand side we need $x(t + \nu_{i}h)$, which we don’t know. We get around this problem in the following manner:

Let $\nu_1=0$, which makes the first term in the quadrature $K_{1} := hf(t,x(t)).$ This we do know and we can also use it to approximate $x(t + \nu_{2}h)$ by writing the second term in the quadrature as $K_{2} := hf(t+\alpha h,x(t) + \beta K_{1})$.

With this, the quadrature formula is: $$\tag3\label{eq3} x(t+h) = x(t) + \omega_{1}K_{1} + \omega_{2}K_{2}.$$
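If it helps to see this as code, here is a minimal Python sketch of a single step of the scheme \eqref{eq3}, with $\alpha$, $\beta$, $\omega_{1}$, $\omega_{2}$ left as free parameters (the function name and signature are my own, not from any library):

```python
def rk2_generic_step(f, t, x, h, alpha, beta, w1, w2):
    """One step of the generic explicit two-stage Runge-Kutta scheme (3)."""
    K1 = h * f(t, x)                          # K1 = h f(t, x(t))
    K2 = h * f(t + alpha * h, x + beta * K1)  # K2 = h f(t + alpha h, x(t) + beta K1)
    return x + w1 * K1 + w2 * K2              # x(t + h) = x(t) + w1 K1 + w2 K2
```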

Some notes:

  1. If we wanted to find a third-order method, we would introduce $K_{3} = hf(t+\tilde{\alpha}h,x+\tilde{\beta}K_{1} + \tilde{\gamma}K_{2})$. If we wanted a fourth-order method, we would introduce $K_{4}$ in a similar way.
  2. This is an explicit method, i.e. we have chosen $K_{2}$ to depend on $K_{1}$ but $K_{1}$ does not depend on $K_{2}$. Similarly, $K_{3}$ (if we introduce it) depends on $K_{1}$ and $K_{2}$ but they do not depend on $K_{3}$. We could allow the dependence to run both ways but then the method is implicit and becomes much harder to solve.
  3. We still need to choose $\alpha$, $\beta$ and the $\omega_{i}$. We will do that now using the Taylor series expansions I introduced at the very beginning.

In equation \eqref{eq3}, we substitute the Taylor series expansion \eqref{eq1} (with $\Delta t = h$) on the left-hand side:

$$x(t) + h x'(t) + \frac{h^{2}}{2!}x''(t) + \mathcal{O}\left(h^{3}\right) = x(t) + \omega_{1}K_{1} + \omega_{2}K_{2}.$$

Since $x' = f$, the chain rule gives $x'' = \frac{\text{d}}{\text{d}t}f(t,x(t)) = f_{t} + f_{x}x' = f_{t} + ff_{x}$ (suppressing arguments for ease of notation), so we have:

$$hf + \frac{h^{2}}{2}(f_{t} + ff_{x}) + \mathcal{O}\left(h^{3}\right) = \omega_{1}K_{1} + \omega_{2}K_{2}.$$

Now substitute for $K_{1}$ and $K_{2}$:

$$hf + \frac{h^{2}}{2}(f_{t} + ff_{x}) + \mathcal{O}\left(h^{3}\right) = \omega_{1}hf + \omega_{2}hf(t + \alpha h, x + \beta K_{1}).$$

Taylor-expand the last term on the right-hand side using \eqref{eq2}, with $\Delta t = \alpha h$ and $\Delta x = \beta K_{1} = \beta hf$:

$$hf + \frac{h^{2}}{2}(f_{t} + ff_{x}) + \mathcal{O}\left(h^{3}\right) = \omega_{1}hf + \omega_{2}(hf +\alpha h^{2}f_{t} + \beta h^{2} ff_{x}) + \mathcal{O}\left(h^{3}\right).$$

Matching the coefficients of $h$ and $h^{2}$ on the two sides, the Runge–Kutta method will agree with the Taylor series approximation to $\mathcal{O}\left(h^{3}\right)$ if we choose:

$$\omega_{1} + \omega_{2} = 1, \qquad \alpha \omega_{2} = \frac{1}{2}, \qquad \beta \omega_{2} = \frac{1}{2}.$$
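These are three equations in four unknowns, so they do not determine the method uniquely: for any $\omega_{2} \neq 0$ we can take $$\omega_{1} = 1 - \omega_{2}, \qquad \alpha = \beta = \frac{1}{2\omega_{2}},$$ which gives a one-parameter family of explicit second-order methods (for instance, $\omega_{2} = 1$ yields the explicit midpoint method).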

The canonical choice for the second-order Runge–Kutta methods is $\alpha = \beta = 1$ and $\omega_{1} = \omega_{2} = 1/2.$
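Written out, one step of this canonical method (often called Heun's method) is $$x(t+h) \approx x(t) + \frac{h}{2}\Big[f(t,x) + f\big(t+h,\,x+hf(t,x)\big)\Big].$$ Here is a minimal Python sketch of it, applied to the test problem $x' = x$, $x(0) = 1$ (my own choice of example, with exact solution $e^{t}$) so that the second-order convergence is visible:

```python
import math

def heun_step(f, t, x, h):
    """One step with alpha = beta = 1 and w1 = w2 = 1/2 (Heun's method)."""
    K1 = h * f(t, x)
    K2 = h * f(t + h, x + K1)
    return x + 0.5 * (K1 + K2)

# Test problem: x' = x, x(0) = 1, exact solution x(t) = exp(t).
f = lambda t, x: x
for n in (10, 20, 40):
    h, t, x = 1.0 / n, 0.0, 1.0
    for _ in range(n):
        x = heun_step(f, t, x, h)
        t += h
    # The error at t = 1 should shrink by roughly a factor of 4 each time h is halved.
    print(n, abs(x - math.e))
```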

The same procedure can be used to find constraints on the parameters of the fourth-order Runge–Kutta methods. The canonical choice in that case is the method you described in your question.
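For completeness, here is the classical fourth-order scheme as a Python sketch (I am assuming this is the method in your question, since it is the standard textbook choice):

```python
def rk4_step(f, t, x, h):
    """One step of the classical fourth-order Runge-Kutta method
    (nodes 0, 1/2, 1/2, 1 and weights 1/6, 1/3, 1/3, 1/6)."""
    K1 = h * f(t, x)
    K2 = h * f(t + h / 2, x + K1 / 2)
    K3 = h * f(t + h / 2, x + K2 / 2)
    K4 = h * f(t + h, x + K3)
    return x + (K1 + 2 * K2 + 2 * K3 + K4) / 6
```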