Finding the best linear unbiased estimator of the mean using Lagrange multipliers

Best linear unbiased estimator of the mean

Let $X_1, X_2, \dots, X_n$ be independent random variables with common expectation $\mu \in \mathbb{R}$ and known standard deviations $0 < \sigma_i < +\infty$. We consider the following estimator of $\mu$: $$ \hat{\mu} := \sum_{i = 1}^n w_i X_i $$ with given weights $w_i \in \mathbb{R}$. A requirement on $\textbf{w} = (w_i)_{i = 1}^n$ is that $g(\textbf{w}) := \sum_{i = 1}^n w_i = 1$, ensuring that $\mathbb{E}\hat{\mu} = \mu$ (unbiasedness). Under this constraint, we would like to minimize the mean quadratic error, $\mathbb{E}((\hat{\mu} - \mu)^2) = f(\textbf{w}) := \sum_{i = 1}^n w_i^2 \sigma_i^2$. We now consider, for an arbitrary $\lambda \in \mathbb{R}$, $$ L(\textbf{w}, \lambda) = f(\textbf{w}) + \lambda g(\textbf{w}) = \sum_{i = 1}^n (w_i^2 \sigma_i^2 + \lambda w_i). $$ As a function of $\textbf{w}$, $L(\cdot, \lambda)$ can be minimized coordinate-wise, and one obtains the unique minimizer $$ \textbf{w}_{\lambda} := \Big(\frac{-\lambda}{2\sigma_i^2}\Big)_{i = 1}^n \quad (\star) $$ The condition $g(\textbf{w}) = 1$ is fulfilled exactly when $\lambda = -2C$, with $$ C := \Big(\sum_{i = 1}^n \frac{1}{\sigma_i^2}\Big)^{-1} $$ The optimal weights are thus given by $w_i := \frac{C}{\sigma_i^2}$, and the corresponding mean quadratic error equals $C$.
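The claimed optimum can be checked numerically; here is a small sketch (the standard deviations are illustrative values, not from the derivation) verifying that the weights $w_i = C/\sigma_i^2$ sum to $1$ and that the resulting mean quadratic error is exactly $C$:

```python
# Illustrative standard deviations (any positive values work).
sigmas = [1.0, 2.0, 0.5]

# C = (sum_i 1/sigma_i^2)^(-1); optimal weights w_i = C / sigma_i^2.
C = 1.0 / sum(1.0 / s**2 for s in sigmas)
w = [C / s**2 for s in sigmas]

# Unbiasedness constraint: the weights sum to 1.
print(sum(w))

# Mean quadratic error f(w) = sum_i w_i^2 sigma_i^2 equals C.
print(sum(wi**2 * s**2 for wi, s in zip(w, sigmas)), C)
```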

My question is:

How can you find the equation $(\star)$, namely $\textbf{w}_{\lambda} := \Big(\frac{-\lambda}{2\sigma_i^2}\Big)_{i = 1}^n$ with the given information?


The minimum of $L(\cdot,\lambda)$ can be found via the first-order condition $$ \frac{\partial L}{\partial w_i} = 2w_i \sigma^2_i + \lambda = 0. $$

Solving for $w_i$ gives us that $w_i = -\frac{\lambda}{2 \sigma_i^2}$.
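You can confirm this stationary point with a finite-difference check: at $w_i = -\lambda/(2\sigma_i^2)$, the numerical gradient of $L(\cdot, \lambda)$ vanishes in every coordinate. The values of `sigmas` and `lam` below are illustrative assumptions:

```python
# Illustrative values; sigma_i > 0 is the only requirement.
sigmas = [1.0, 2.0, 0.5]
lam = -0.4

def L(w):
    # L(w, lambda) = sum_i (w_i^2 sigma_i^2 + lambda w_i)
    return sum(wi**2 * s**2 + lam * wi for wi, s in zip(w, sigmas))

# Candidate minimizer from the first-order condition.
w_star = [-lam / (2 * s**2) for s in sigmas]

h = 1e-6
for i in range(len(sigmas)):
    w_plus = list(w_star); w_plus[i] += h
    w_minus = list(w_star); w_minus[i] -= h
    grad_i = (L(w_plus) - L(w_minus)) / (2 * h)
    print(abs(grad_i) < 1e-8)  # central difference: gradient vanishes
```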

Strictly speaking, you would also need to check a second-order condition to verify that this stationary point is, in fact, a minimum. Here that is immediate: $\frac{\partial^2 L}{\partial w_i^2} = 2\sigma_i^2 > 0$ for every $i$, so $L(\cdot, \lambda)$ is strictly convex and the stationary point is its unique minimizer.
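The strict-convexity argument can also be illustrated empirically: random perturbations of the stationary point never decrease $L$. Again, `sigmas` and `lam` are illustrative assumptions:

```python
import random

# Illustrative values.
sigmas = [1.0, 2.0, 0.5]
lam = -0.4

def L(w):
    # L(w, lambda) = sum_i (w_i^2 sigma_i^2 + lambda w_i)
    return sum(wi**2 * s**2 + lam * wi for wi, s in zip(w, sigmas))

w_star = [-lam / (2 * s**2) for s in sigmas]

# Since d^2L/dw_i^2 = 2 sigma_i^2 > 0, L is strictly convex in w,
# so any perturbed point has L(w) >= L(w_star).
random.seed(0)
for _ in range(100):
    w = [wi + random.uniform(-1.0, 1.0) for wi in w_star]
    assert L(w) >= L(w_star)
print("all random perturbations increase L")
```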