To find $\mathbf{x}$ such that

$$A\mathbf{x}=\mathbf{b}$$

we can use least squares when the problem is not well posed. When $A$ is ill-conditioned, we can additionally use Tikhonov regularization, in which we minimize

$$\|A\mathbf{x}-\mathbf{b}\|_2^2+\|\Gamma\mathbf{x}\|_2^2$$

where $\Gamma$ determines the regularization properties. Alternatively, we could use the truncated SVD of $A$ to form a pseudoinverse. Given the SVD $A=U\Sigma V^T$, the truncated SVD is

$$U\Sigma_kV^T$$

where $\Sigma_k$ keeps only the $k$ largest singular values (the remaining ones are set to zero). Truncating the SVD provides another means of regularization by producing solutions with smaller norms.
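
For concreteness, here is a small numpy sketch of the two approaches I have in mind (my own illustration; the helper names are made up):

```python
import numpy as np

def tikhonov_solve(A, b, Gamma):
    # Tikhonov: minimize ||A x - b||^2 + ||Gamma x||^2 via the normal
    # equations (A^T A + Gamma^T Gamma) x = A^T b.
    return np.linalg.solve(A.T @ A + Gamma.T @ Gamma, A.T @ b)

def tsvd_solve(A, b, k):
    # Truncated-SVD pseudoinverse: keep only the k largest singular values.
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    return Vt[:k].T @ ((U[:, :k].T @ b) / s[:k])
```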

When is Tikhonov regularization similar to (or even the same as) using the truncated SVD?


Consider the SVD of the $N\times N$ matrix $A$ to be $$A = U\Sigma V^T$$ where $U$ and $V$ are orthogonal matrices and $\Sigma$ is a diagonal matrix with entries $$\sigma_1 \geq \sigma_2 \geq \cdots \geq \sigma_N \geq 0.$$ For an ill-conditioned matrix $A$, the singular values decay gradually to zero and the condition number

$$\operatorname{cond}(A) = \frac{\sigma_1}{\sigma_N}$$

is very large.
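
Just to fix notation, this is how one would compute the singular values and the condition number in numpy (a trivial sketch; the function name is mine):

```python
import numpy as np

def condition_number(A):
    # cond(A) = sigma_1 / sigma_N, the ratio of largest to smallest singular value.
    s = np.linalg.svd(A, compute_uv=False)  # singular values, sorted descending
    return s[0] / s[-1]
```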

Writing the naive solution in terms of the SVD, $$x_{\rm naive} = A^{-1}b = V\Sigma^{-1}U^Tb = \sum_{i=1}^{N}\frac{u_i^Tb}{\sigma_i}v_i,$$ shows the difficulty: the terms with small $\sigma_i$ are strongly amplified. The idea of the SVD approach (a.k.a. spectral filtering) is to damp the effects caused by division by these small singular values.
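
As a quick sketch (my own illustration), the naive solution can be computed directly from the SVD; any noise component $u_i^Tb$ gets divided by $\sigma_i$, which is what blows up for the small singular values:

```python
import numpy as np

def naive_svd_solve(A, b):
    # x_naive = V Sigma^{-1} U^T b = sum_i (u_i^T b / sigma_i) v_i.
    # Terms with small sigma_i dominate and amplify any noise in b.
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    return Vt.T @ ((U.T @ b) / s)
```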

The TSVD method is an example of the general class of spectral filtering methods, which have the form $$x_{\rm filt} = \sum_{i=1}^{N}\phi_i\frac{u_i^Tb}{\sigma_i}v_i$$ where the filter factors $\phi_i$ are chosen such that $\phi_i \approx 1$ for the large singular values and $\phi_i \approx 0$ for the small ones.
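
In code, a generic spectral-filter solution looks like this (a sketch; `phi` is whatever vector of filter factors you choose):

```python
import numpy as np

def filtered_solve(A, b, phi):
    # x_filt = sum_i phi_i * (u_i^T b / sigma_i) * v_i for given filter factors phi.
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    return Vt.T @ (phi * ((U.T @ b) / s))
```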

Now look at the filter factors for the two methods.

$\bf{The~TSVD~Method}$ $$\phi_i = \begin{cases} 1, & i = 1,\dots,k \\ 0, & \text{otherwise} \end{cases}$$ The parameter $k < N$ is called the truncation parameter.
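
A minimal sketch of these filter factors, to be used with the `filtered_solve` helper above (the function name is mine):

```python
import numpy as np

def tsvd_filter(s, k):
    # phi_i = 1 for the k largest singular values, 0 for the rest.
    phi = np.zeros_like(s)
    phi[:k] = 1.0
    return phi
```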

$\bf{The~Tikhonov~Method}$ $$\phi_i = \frac{\sigma_i^2}{\sigma_i^2 + \alpha^2}, \quad i = 1,\dots,N$$ The parameter $\alpha > 0$ is called the regularization parameter. This choice of filter factors yields the solution vector $x_{\alpha}$ of the minimization problem $$\min_x \left\{ \|b-Ax\|_2^2 + \alpha^2 \|x\|_2^2 \right\}.$$
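
And the corresponding Tikhonov filter factors (again just a sketch):

```python
import numpy as np

def tikhonov_filter(s, alpha):
    # phi_i = sigma_i^2 / (sigma_i^2 + alpha^2): ~1 when sigma_i >> alpha,
    # ~0 when sigma_i << alpha.
    return s**2 / (s**2 + alpha**2)
```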

To answer your question of when Tikhonov regularization becomes similar (or equal) to TSVD, compare the filter factors: for $\sigma_i \gg \alpha$ we have $\phi_i \approx 1$, and for $\sigma_i \ll \alpha$ we have $\phi_i \approx 0$. So Tikhonov with regularization parameter $\alpha$ behaves like TSVD with the truncation parameter $k$ chosen so that $\sigma_k \approx \alpha$, and the two are essentially the same when the singular values have a sharp gap around that cutoff, i.e. $\sigma_k \gg \alpha \gg \sigma_{k+1}$. You can think of the filtering this way: TSVD uses a filter with a sharp jump from 0 to 1, while Tikhonov uses a smoother transition (which also helps prevent oscillations in the solution). For more details see Spectra and Filtering.
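
Here is a small self-contained experiment (a toy setup I made up) illustrating that when $\alpha$ falls inside a gap of the spectrum, $\sigma_k \gg \alpha \gg \sigma_{k+1}$, the two filtered solutions nearly coincide:

```python
import numpy as np

rng = np.random.default_rng(0)
Q1, _ = np.linalg.qr(rng.standard_normal((8, 8)))
Q2, _ = np.linalg.qr(rng.standard_normal((8, 8)))
s = np.array([10.0, 5.0, 2.0, 1.0, 1e-4, 1e-5, 1e-6, 1e-7])  # sharp gap after the 4th value
A = Q1 @ np.diag(s) @ Q2.T                                    # ill-conditioned test matrix
b = A @ rng.standard_normal(8) + 1e-6 * rng.standard_normal(8)  # noisy right-hand side

U, sig, Vt = np.linalg.svd(A)
coeffs = (U.T @ b) / sig                             # u_i^T b / sigma_i
k, alpha = 4, 1e-2                                   # alpha sits inside the spectral gap
phi_tsvd = (np.arange(len(sig)) < k).astype(float)   # sharp 0/1 step filter
phi_tikh = sig**2 / (sig**2 + alpha**2)              # smooth Tikhonov filter
x_tsvd = Vt.T @ (phi_tsvd * coeffs)
x_tikh = Vt.T @ (phi_tikh * coeffs)
print(np.linalg.norm(x_tsvd - x_tikh) / np.linalg.norm(x_tsvd))  # small relative difference
```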