Is there a slowest divergent function?
So I've been playing around with some functions for a while and started wondering about a slowest divergent function (one with $\lim_{x\to\infty} f(x)=\infty$), so I searched around for an answer.
I can see that there are ways to construct a new function that necessarily diverges slower than the original one. But then it struck me that this really feels like a property of an open set: there is no smallest element in $(0,\infty)$.
So the question is this: is it possible to recursively define a function $f(x)$ such that $\lim_{x\to\infty} f(x)=\infty$ and $$ \forall g \neq f \text{ with } \lim_{x\to\infty} g(x)=\infty: \qquad \lim_{x\to\infty} {g(x)\over f(x)}=\infty\,? $$ For an example of a function defined recursively, consider $$ f(x)={x^{1\over f(x)}\over \ln(x)}, $$ though I have no idea how this one behaves.
The reason I'm stressing the recursion is because even though $(0,\infty)$ has no smallest element, its elements can get arbitrarily close to the endpoint, and to do something analogous with a function, I'm guessing recursion is the way to go.
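Edit: out of curiosity I probed the example equation numerically. For a fixed $x > e$, the right-hand side $x^{1/f}/\ln(x)$ is decreasing in $f$, so $g(f) = f - x^{1/f}/\ln(x)$ is increasing and bisection should find the unique positive solution. This is just an exploratory sketch, not an analysis of the recursion:

```python
import math

def implicit_f(x, tol=1e-10):
    """Solve f = x**(1/f) / log(x) for f > 0 by bisection.

    Assumes x > e, so the right-hand side is decreasing in f and
    g(f) = f - x**(1/f)/log(x) crosses zero exactly once.
    """
    lnx = math.log(x)

    def g(f):
        t = lnx / f
        if t > 700:           # math.exp would overflow; RHS is astronomically large
            return -1.0
        return f - math.exp(t) / lnx

    lo, hi = 1e-6, 1e6        # g(lo) < 0, g(hi) > 0
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if g(mid) < 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

for x in [1e3, 1e6, 1e12]:
    print(x, implicit_f(x))
```

The solved values grow slowly with $x$, which at least suggests (but of course does not prove) that the implicitly defined function diverges.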
No such $f$ exists. If $f$ is any function such that $\lim_{x \to \infty} f(x) = \infty$, then consider the function $\log(f)$. We have that $\lim_{x \to \infty} \log(f(x)) = \infty$. But the function $\log(f)$ diverges slower than $f$, because we have that $\lim_{x \to \infty} \frac{\log(f(x))}{f(x)}=0$.
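The ratio tending to $0$ is easy to see numerically. A quick Python illustration (not a proof) with $f(x) = x$:

```python
import math

# For f(x) = x, the ratio log(f(x)) / f(x) shrinks toward 0 as x grows,
# even though log(f(x)) itself still tends to infinity.
for x in [10.0, 1e3, 1e6, 1e12]:
    print(x, math.log(x), math.log(x) / x)
```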
Now your idea of recursion brings up another interesting point. We can recursively define a sequence of functions $f_n$ as follows: $f_0(x)=x$, and $f_{n+1}(x)=\log f_n(x)$. Then $f_{n+1}$ always diverges slower than $f_n$. One may then ask: for any function $f$ such that $\lim_{x \to \infty} f(x) = \infty$, does there exist $n \in \mathbb{N}$ such that $f_n$ diverges slower than $f$?
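A quick numeric look at the iterated logarithms (a sketch; any $x$ large enough that all the logs stay positive will do):

```python
import math

def f(n, x):
    """f_0(x) = x, f_{n+1}(x) = log(f_n(x)), defined where the logs stay positive."""
    for _ in range(n):
        x = math.log(x)
    return x

x = 1e300  # large enough that f_0 .. f_3 are all positive
vals = [f(n, x) for n in range(4)]
print(vals)  # roughly [1e300, 690.8, 6.54, 1.88] -- each level is drastically smaller
```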
The answer is still no. To prove this, we'll construct a function $f$ which diverges slower than all of the $f_n$. Let $a_1=1$. Suppose $a_1 < \cdots < a_n$ have been constructed so that $f_{i+1}(a_{i+1})>\frac{1}{i}+f_i(a_i)$ for all $1 \leq i \leq n-1$. Then choose $a_{n+1}>a_n$ such that $f_{n+1}(a_{n+1})>\frac{1}{n}+f_n(a_n)$ [such $a_{n+1}$ exists since $f_{n+1}$ diverges]. Continuing inductively, we get a sequence $(a_n)$ such that $f_n(a_n) \to \infty$. Define a function $f$ by interpolating all of the points $(a_n, f_n(a_n))$. Then you can check that $\lim_{x \to \infty} f(x)= \infty$, but $f$ diverges slower than all of the $f_n$.
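To spell out the last step (a sketch, checked at the interpolation nodes): each extra logarithm shrinks large values, so for $n \ge m+1$ and $x$ large enough, $f_n(x) \le f_{m+1}(x)$. Hence at the nodes, $$ f(a_n) = f_n(a_n) \le f_{m+1}(a_n) \quad (n \ge m+1), \qquad \frac{f(a_n)}{f_m(a_n)} \le \frac{f_{m+1}(a_n)}{f_m(a_n)} \to 0, $$ since $f_{m+1}/f_m \to 0$; between nodes the interpolation keeps $f$ under control, so $f(x)/f_m(x) \to 0$ for every $m$.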
We can do a similar construction to show that for any countable collection of divergent functions, we can find a function which diverges slower than all of these (which is stronger than your original question, pertaining to just one function).
What you call a "recursive definition" isn't really a definition; it's a formula which may be satisfied by one function, several, or none at all.
Your question seems a little contradictory: in your second paragraph you say you know no slowest function exists, since you can always construct a slower one, and then you immediately go on to ask for a slowest function regardless. The fact that you have a novel method of defining a function doesn't change the fact that no such slowest function exists, as you yourself observed. The fact that you can always construct a slower function means you don't even need to consider new methods of defining your function, because the operative word is always. It would be a bit like if you proved to me that no rational number $a$ satisfies $a^2=2$, and I said, "Okay, but what if I used a really big denominator?" You would simply reply, "I said no rational number."
On the other hand, as discussed in Shalop's answer, if you're willing to relax your definition of a "slowest" function, then the idea of defining a sequence of functions via some recurrence relation may yield interesting results. A very simple such relation is $f_{n+1} = \sqrt[3]{f_n}$. As long as $f_0$ diverges then so do all the $f_n$, and each diverges more slowly than the last.
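With $f_0(x) = x$ this recurrence gives $f_n(x) = x^{3^{-n}}$ explicitly, which makes both claims easy to check numerically (a sketch, not a proof):

```python
# f_0(x) = x, f_{n+1} = f_n ** (1/3), i.e. f_n(x) = x ** (3 ** -n).
# Every f_n still diverges, but each more slowly than the last, since
# f_{n+1}(x) / f_n(x) = x ** (-2 * 3 ** -(n+1)) -> 0 as x -> infinity.
def f(n, x):
    return x ** (3.0 ** -n)

x = 1e30
print([f(n, x) for n in range(4)])  # approximately [1e30, 1e10, 2154.4, 12.9]
```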