Determine $a \in \mathbb{R}$ such that $f_a(x)=\frac{1}{x-\log x+a}$ is defined for every $x>0$

Problem: determine for which $a \in \mathbb{R}$ the function $f_a(x)=\frac{1}{x-\log x+a}$ is defined for every $x>0$.

I have attempted a solution of my own (on which I would kindly ask your feedback), and I also have a question about my textbook's solution.

My attempt: since $x>0$ by hypothesis, $\log x$ is well defined, hence $f_a$ is defined for every $x>0$ if and only if $x-\log x+a\ne0$ for every $x>0$.

Define $g:(0,\infty) \to \mathbb{R}$ by $g(x)=x-\log x$: $g$ is differentiable, being the difference of two differentiable functions, with $g'(x)=1-\frac{1}{x}$. Since $g'<0$ on $(0,1)$ and $g'>0$ on $(1,\infty)$, $g$ attains its absolute minimum at $x_0=1$, with minimum value $g(1)=1$. Moreover, $g(x) \to \infty$ as $x \to \infty$ and, since $g$ is continuous, the intermediate value theorem gives $g\left((0,\infty)\right)=[1,\infty)$.
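As a quick numerical sanity check of these facts about $g$ (not part of the proof; the sample grid and endpoints below are arbitrary choices of mine):

```python
import math

def g(x):
    return x - math.log(x)

# Sample g on a logarithmic grid from 1e-3 to 1e3 and locate the smallest value.
xs = [10 ** (k / 100) for k in range(-300, 301)]
x_min = min(xs, key=g)
print(x_min, g(x_min))   # 1.0 1.0  -> minimum value 1 attained at x = 1
print(g(1e-9), g(1e9))   # both large, consistent with g -> infinity at 0+ and at infinity
```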

Hence $g(x)+a=x-\log x+a \ge 1+a$ for every $x>0$, so $f_a$ is certainly defined for every $x>0$ if $a>-1$, since then the denominator is bounded below by $1+a>0$.

Finally, the intermediate value theorem assures that for every $\lambda \in [1,\infty)$ there exists $x_\lambda \in (0,\infty)$ such that $g(x_\lambda)=\lambda$. In particular, if $a \le -1$, then $-a \in [1,\infty)$, so there exists $x_{-a} \in (0,\infty)$ with $g(x_{-a})+a=0$; hence $f_a$ is not defined at $x_{-a}$, and so $f_a$ is not defined for every $x>0$ when $a \le -1$. Is my solution correct?
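For concreteness, here is a small numerical illustration of the two cases (the particular values $a=-0.5$ and $a=-2$, and the bracket $[1,10]$, are just examples I picked):

```python
import math

def denom(x, a):
    return x - math.log(x) + a

# Case a > -1: on a sample grid the denominator never drops below 1 + a > 0.
a = -0.5
xs = [10 ** (k / 100) for k in range(-300, 301)]
print(min(denom(x, a) for x in xs))   # 0.5 = 1 + a, attained at x = 1

# Case a <= -1: bisection on [1, 10] locates a zero of the denominator,
# i.e. a point where f_a is undefined.
a = -2.0
lo, hi = 1.0, 10.0                    # denom(1, a) = -1 < 0 < denom(10, a)
for _ in range(60):
    mid = (lo + hi) / 2
    if denom(mid, a) < 0:
        lo = mid
    else:
        hi = mid
print(mid, denom(mid, a))             # x ~ 3.146..., denominator ~ 0
```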

My textbook solves this problem by saying: "The condition $x-\log x +a \ne 0$ for any $x>0$ is equivalent to $x-\log x+a>0$ for any $x>0$ because $x-\log x \to \infty$ as $x \to \infty$", without further explanation. My question is: does this follow from the fact that $$\forall x>0,\ x-\log x+a \ne 0 \iff \forall x>0,\ (x-\log x+a<0) \vee (x-\log x+a>0),$$ combined with the following argument? Since $x-\log x \to \infty$ as $x \to \infty$, for every $M>0$ there exists $x_M>0$ such that $x \ge x_M \implies x-\log x>M$. As $M>0$ is arbitrary, I can apply the definition of limit with $K-a$ in place of $M$ (with $K>0$) to obtain $x^*:=x_{K-a}$ such that $$[x \ge x^* \implies x-\log x>K-a] \implies [x-\log x+a>K>0] \implies x-\log x+a>0.$$ So it cannot happen that $x-\log x +a<0$ for every $x>0$, because this fails for $x \ge x^*$; and since a disjunction $p \vee q$ is true when $p$, $q$, or both are true, when one disjunct is certainly false the disjunction is logically equivalent to the remaining one (about which we still do not know whether it is true or false)?
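(As a numerical sanity check of the "eventually positive" step in my reasoning, here is the argument instantiated with concrete numbers; $K=1$, $a=-3$ and the threshold $x^*=6$ are just sample values I chose:)

```python
import math

K, a = 1.0, -3.0
# K - a = 4; since g(x) = x - log(x) is increasing for x > 1, any x* with
# g(x*) > 4 serves as a threshold.  x* = 6 works:
x_star = 6.0
print(x_star - math.log(x_star))          # ~4.21 > K - a = 4

# Beyond x*, the denominator stays above K > 0 (checked on a few sample points).
for x in [6, 10, 100, 1000]:
    assert x - math.log(x) + a > K
print("x - log(x) + a > K for all sampled x >= x*")
```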


Your solution is correct, but you should clean it up. You already argue that $g((0,\infty)) = [1, \infty)$, but then go back and argue that same thing again.

As for the book's solution, they are just applying the Intermediate Value Theorem themselves. Since $x - \log x + a$ is continuous and eventually positive, if it were negative somewhere, then between a point where it is negative and a point where it is positive it would have to pass through $0$. So if $x - \log x + a \ne 0$ for every $x > 0$, it has to be positive everywhere.
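A quick numerical illustration of that sign-change argument (taking $a=-1.5$ as an example value for which the denominator does go negative):

```python
import math

a = -1.5
d = lambda x: x - math.log(x) + a
print(d(1.0))    # -0.5: negative at x = 1
print(d(10.0))   # ~6.2: eventually positive
# Continuity plus this sign change forces a zero somewhere in (1, 10),
# so for this a the denominator cannot be nonzero on all of (0, infinity).
```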