Rate of convergence of modified Newton's method for multiple roots
If $k=1$, you have a simple root and your iteration reduces to the standard Newton's method, which we know converges quadratically in that case.
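(If you want to see this numerically, here is a quick sanity check in Python on an example of my own choosing, not from your question: $f(x) = x^2 - 2$ has a simple root at $r = \sqrt{2}$, and the exponent of the error roughly doubles at each step.)

```python
import math

# Sanity check on an example of my own (not from the question): f(x) = x^2 - 2
# has a simple root at r = sqrt(2), so plain Newton should converge quadratically.
f, df, r = lambda x: x * x - 2.0, lambda x: 2.0 * x, math.sqrt(2.0)

x = 2.0
for n in range(6):
    print(f"n = {n}: error = {abs(x - r):.3e}")
    x -= f(x) / df(x)   # Newton step: the error is roughly squared each iteration
```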
If $k > 1$, you have a multiple root: if $f$ has a root of multiplicity $k$ at $r$, then it can be written in the form $f(x) = (x-r)^k h(x)$ where $h(r) \neq 0$.
Now your iteration is a fixed-point iteration of the form $x_{n+1} = g(x_n)$ where $g(x) := x - k f(x)/f'(x)$ (note that the multiplicity $k$ is fixed, so I use $n$ for the iteration index). Using the expression of $f$ above, you can check that $g(r) = r$, which is why $g$ is called a fixed-point function (it does not move $r$) and $r$ is called a fixed point of $g$. The key property of fixed-point iterations is that they converge (to a fixed point of $g$) provided $|g'(r)| < 1$ and you initialize them sufficiently close to $r$.

If you denote by $e_n := x_n - r$ the error at the $n$-th iteration, then, using a Taylor expansion, you get $$ e_{n+1} = x_{n+1} - r = g(x_n) - r \approx g(r) + g'(r) (x_n - r) - r = g'(r) e_n $$ because $g(r) = r$. So essentially, the error is reduced by a factor $|g'(r)|$ at each iteration once you are sufficiently close to $r$. If $g'(r) \neq 0$, this is called linear convergence.

But what if $g'(r) = 0$? You just perform a second-order expansion instead of a first-order one and you get $$ e_{n+1} \approx g(r) + g'(r) e_n + \tfrac{1}{2} g''(r) e_n^2 - r = \tfrac{1}{2} g''(r) e_n^2, $$ and now the error is (roughly) squared at each iteration. This is called quadratic convergence. (If $g''(r) = 0$ too, you go to third order, etc.)
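To make this concrete, here is a small Python experiment on an example of my own (not from your question): $f(x) = (x-1)^3 e^x$ has a root of multiplicity $k = 3$ at $r = 1$, with $h(x) = e^x$ and $h(1) \neq 0$. A standard computation gives $g'(r) = 1 - 1/k$ for plain Newton at such a root, so it should converge only linearly with rate $2/3$ here, while the modified iteration should converge quadratically.

```python
import math

# Example of my own: f(x) = (x - 1)^3 * exp(x), root r = 1 of multiplicity k = 3,
# so h(x) = exp(x) and h(1) != 0 as required.
r, k = 1.0, 3
f  = lambda x: (x - 1.0) ** 3 * math.exp(x)
df = lambda x: (x - 1.0) ** 2 * (x + 2.0) * math.exp(x)   # f'(x), computed by hand

def errors(g, x0, n_iter=7):
    """Errors |x_n - r| along the fixed-point iteration x_{n+1} = g(x_n)."""
    out, x = [], x0
    for _ in range(n_iter):
        out.append(abs(x - r))
        if df(x) == 0.0:              # landed exactly on the root; stop iterating
            break
        x = g(x)
    return out

plain = errors(lambda x: x - f(x) / df(x), x0=2.0)       # g'(r) = 1 - 1/k = 2/3
mod   = errors(lambda x: x - k * f(x) / df(x), x0=2.0)   # g'(r) = 0

print("plain Newton errors:   ", " ".join(f"{e:.2e}" for e in plain))
print("modified Newton errors:", " ".join(f"{e:.2e}" for e in mod))
```

You should see the plain-Newton errors shrink by a factor of roughly $2/3$ per step, while the modified-Newton errors roughly square at each step (the exponent doubles) until they hit machine precision.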
Why am I explaining all this? If you want to show that your modified Newton iteration converges quadratically, it suffices to show that $g(r) = r$ and $g'(r) = 0$.
That's exactly the case with your iteration, and it's relatively easy to verify if you write $f(x) = (x-r)^k h(x)$: you get $f'(x) = (x-r)^{k-1}\bigl(k h(x) + (x-r) h'(x)\bigr)$, so $$ g(x) = x - \frac{k (x-r) h(x)}{k h(x) + (x-r) h'(x)}. $$ Differentiate, simplify, and plug in $x = r$ (the denominator equals $k h(r) \neq 0$ there); you'll find that $g'(r) = 0$.
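If you'd rather let a computer check the algebra, here is a small SymPy sketch (my own, for one sample choice of the pieces): it takes $k = 3$ and $h(x) = e^x$, but any fixed $k > 1$ and any $h$ with $h(r) \neq 0$ should lead to the same conclusion.

```python
import sympy as sp

x, r = sp.symbols('x r')
k = 3                      # a sample multiplicity; any fixed k > 1 works the same way
h = sp.exp(x)              # a sample h with h(r) != 0
f = (x - r) ** k * h

g  = x - k * f / sp.diff(f, x)        # the modified-Newton fixed-point function
gp = sp.simplify(sp.diff(g, x))       # g'(x)

# Use a limit rather than direct substitution, since the unsimplified
# expression has a removable 0/0 at x = r.
print(sp.limit(gp, x, r))             # prints 0, i.e. g'(r) = 0
```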