The proof doesn't use that $\int f(x)g(x)\,dx = \int f(x)\,dx \int g(x)\,dx$ (which is wrong), but rather $$\int\!\!\int f(x)g(y)\,dx\,dy = \int g(y) \int f(x)\,dx \,dy = \int f(x)\,dx\int g(y)\,dy.$$ Note that this follows simply from the property that $\int C f(x)\,dx = C\int f(x)\,dx$: in the first equality we set $C = g(y)$ to pull $g(y)$ out of the inner integral, since it does not depend on $x$; in the second step we use that $C := \int f(x)\,dx$ does not depend on $y$ to pull it out of the integral with respect to $y$. By symmetry, the same reasoning shows that $$\int\!\!\int f(x)g(y)\,dx\,dy = \int\!\!\int f(x)g(y)\,dy\,dx,$$ and therefore, with $h(x,y) := f(x)g(y)$, it follows that $L(h) = L'(h)$. The general case with a product of $k$ functions is completely analogous and can be shown, e.g., by induction.

Now note that $L$ and $L'$ are linear operators, that is, $$L(\alpha h_1 + \beta h_2) = \alpha L(h_1) + \beta L(h_2), \quad L'(\alpha h_1 + \beta h_2) = \alpha L'(h_1) + \beta L'(h_2),$$ where $\alpha, \beta \in \mathbb{R}$ and $h_1,h_2$ are continuous functions. In particular, since $L(h) = L'(h)$ for every $h$ of product form, it follows that $L(g) = L'(g)$ for every $g \in \mathcal{A}$. Consider, for example, the case $g = \alpha h_1 + \beta h_2$, where $h_1,h_2$ are of product form. Then $L(h_1) = L'(h_1)$, $L(h_2) = L'(h_2)$, and linearity yield \begin{align*} L(g) &= L(\alpha h_1 + \beta h_2) = \alpha L(h_1) + \beta L(h_2) = \alpha L'(h_1) + \beta L'(h_2) \\ &= L'(\alpha h_1 + \beta h_2) = L'(g). \end{align*} An induction on the number of terms extends this to arbitrary finite linear combinations of product-form functions, i.e., to all of $\mathcal{A}$.
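For concreteness, here is the two-variable identity checked on a simple example (the choice $f(x) = x$, $g(y) = y^2$ on $[0,1]^2$ is purely illustrative and not part of the original argument): $$\int_0^1\!\!\int_0^1 x\,y^2\,dx\,dy = \int_0^1 y^2\left(\int_0^1 x\,dx\right)dy = \frac{1}{2}\int_0^1 y^2\,dy = \frac{1}{2}\cdot\frac{1}{3} = \left(\int_0^1 x\,dx\right)\left(\int_0^1 y^2\,dy\right) = \frac{1}{6}.$$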
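And here is a sketch of the induction step for the $k$-fold product mentioned above (with the notation assumed to extend in the obvious way, so this is a sketch rather than a full proof): in the innermost integral, the factors $f_2(x_2),\dots,f_k(x_k)$ do not depend on $x_1$ and can be pulled out as constants; the induction hypothesis then factors the remaining $k-1$ integrals: $$\int\!\cdots\!\int \prod_{i=1}^{k} f_i(x_i)\,dx_1\cdots dx_k = \left(\int f_1(x_1)\,dx_1\right)\int\!\cdots\!\int \prod_{i=2}^{k} f_i(x_i)\,dx_2\cdots dx_k = \prod_{i=1}^{k}\int f_i(x_i)\,dx_i.$$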