Are there any situations in which L'Hopital's Rule WILL NOT work?
Solution 1:
Consider the function $f(x)=e^{-\frac{1}{x^2}}$ for all $x\in \mathbb R$, $x\ne 0$, with $f(0)=0$ (or take any other function with the property that all derivatives at $0$ vanish, but the function is not locally constant at $0$). Now suppose you are asked to compute $\lim_{x\to 0}\frac{f(x)}{f(x)}$. Of course, this limit is $1$ by simply simplifying the fraction first and then taking the limit. But if you try to use L'Hopital's rule, you find that the conditions are met, yet $\lim_{x\to 0}\frac{f'(x)}{f'(x)}$ is still of indeterminate form. Again L'Hopital's rule is applicable, and again $\lim_{x\to 0}\frac{f''(x)}{f''(x)}$ is indeterminate. This will go on forever. So even though the limit can be determined, and even though the conditions for L'Hopital's rule are repeatedly met, you will never get the result this way.
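For concreteness, here is a quick sketch of why the indeterminacy never goes away (just the chain rule): $$f'(x) = \frac{2}{x^3}\,e^{-\frac{1}{x^2}} \quad (x \ne 0), \qquad \lim_{x\to 0} f'(x) = 0,$$ so $\frac{f'(x)}{f'(x)}$ is again a $\frac{0}{0}$ form. Every higher derivative is $e^{-\frac{1}{x^2}}$ times a rational function of $x$, hence also tends to $0$, and the same situation repeats at every step.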
Solution 2:
L'Hopital's rule fails if $$\lim_{x\to x_0} \frac f g \text{ exists but } \lim_{x\to x_0}\frac {f'}{g'} \text{ doesn't,}$$
e.g. $$\lim_{x\to \infty} \frac {x+\sin x}x.$$
I wonder if we could extend the Stolz–Cesàro theorem to the continuous case, i.e. whether $$\liminf \frac {f'}{g'}\le \liminf \frac f g\le \limsup \frac f g\le \limsup \frac {f'}{g'}$$ holds, with all limits taken as $x\to \infty$ and assuming $\lim f=\lim g=\infty$.
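As a sanity check on that proposed inequality (a sketch only, assuming the statement above), the example $\frac{x+\sin x}{x}$ gives $$\liminf_{x\to\infty}\frac{f'}{g'} = \liminf_{x\to\infty}\,(1+\cos x) = 0, \qquad \limsup_{x\to\infty}\,(1+\cos x) = 2,$$ and indeed the actual limit, $\lim_{x\to\infty}\frac{x+\sin x}{x} = 1$, sits between $0$ and $2$.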
Solution 3:
In a more subtle way, one often forgets the hypotheses: there must exist a neighbourhood $V$ of $a$ such that on $V\smallsetminus\{a\}$, neither $g(x)$ nor $g'(x)$ vanishes.
Using Taylor polynomials is much safer. Better yet, when possible, using equivalents of the numerator and denominator is more elegant.
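For example (a small illustration of the Taylor approach), a limit such as $\lim_{x\to 0}\frac{x-\sin x}{x^3}$, which would take three rounds of L'Hopital, falls out immediately from the expansion $\sin x = x - \frac{x^3}{6} + o(x^3)$: $$\lim_{x\to 0}\frac{x-\sin x}{x^3} = \lim_{x\to 0}\frac{\frac{x^3}{6} + o(x^3)}{x^3} = \frac{1}{6}.$$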
Solution 4:
Our first step toward understanding when L'Hospital's Rule is applicable is to consider its formal statement. It is a theorem, so it is always correct; we just have to make sure that all of its hypotheses hold for the limit we are trying to evaluate. Let's take a look!
Theorem (L'Hospital's Rule for right-hand limits):
Let $f$, $g$ be functions:
Condition 1. That are differentiable on the interval $(a,b)$, where we allow the possibility $a = -\infty$ and/or $b = +\infty$
Condition 2. Such that the derivative of $g$ is never zero on that interval
Condition 3. Satisfying $\text{ }\text{ }\lim_{x \to a^+} f(x) = \lim_{x \to a^+} g(x) = 0\text{ }\text{ }$ or $\text{ }\text{ }\lim_{x \to a^+} g(x) = \pm \infty $
Condition 4. Satisfying $\text{ }\text{ }\lim_{x \to a^+} \dfrac{f'(x)}{g'(x)} = L\text{ }\text{ }$ (where we allow the possibility that $L$ is a real number, $+\infty$ or $-\infty$).
Then
$\text{ }\text{ }\lim_{x \to a^+} \dfrac{f(x)}{g(x)} = L\text{ }\text{ }$
For left-hand limits, replace all $x \to a^+$ occurrences with $x \to b^-$.
For "normal" limits, just check that both parts of L'Hospital theorems apply and the left-hand limit found is equal to the right-hand limit found. This is correct because we know that a limit exists if and only if the lateral limits exist and are equal.
Seeing the formal statement of L'Hospital's theorem is very useful in understanding when it can be applied. Let's look again and make some remarks:
Remark 1. Make sure your functions are differentiable on that interval $(a, b)$. If you are taking $x \to a^+$, for example, remember that you are free to choose $b$: you can choose $b$ as close to $a$ as you want. The only problem here would be if your function is so weird that, no matter how close you get to $a$, it is still not differentiable on that little interval.
Remark 2. After you found that interval $(a, b)$, make sure $g'$ is never zero inside that interval. Again, remember, if your interval didn't work, don't give up just yet, try to see if there is a smaller interval that fits your needs.
Remark 3. You can only apply the rule if you have the form $\frac{0}{0}$ or the denominator tends to $\pm\infty$.
Remark 4. The last condition says that you must be able to compute $\text{ }\text{ }\lim_{x \to a^+} \dfrac{f'(x)}{g'(x)}\text{ }\text{ }$ somehow, perhaps with another application of L'Hospital's Rule.
Example
Suppose you want to apply L'Hospital's Rule to compute $$\lim_{x \to 0} \dfrac{x}{\sin(x)}$$
Let's check the four conditions, taking $f(x) = x$ and $g(x) = \sin(x)$.
Condition 1. If we choose $a = 0$ and $b = 2\pi$, both $f$ and $g$ are differentiable in $(0,2\pi)$, so it is fine.
Condition 2. Unfortunately, the derivative of $g$, which is $\cos(x)$, is zero at $x = \frac{\pi}{2}$, and this would violate Condition 2. Fortunately, we can avoid this problem by choosing a smaller interval, say $a = 0$ and $b = \frac{\pi}{4}$. Now both functions are differentiable on the interval and the derivative of $g$ is never zero there.
Condition 3. We have the first option, because $\text{ }\text{ }\lim_{x \to 0^+} f(x) = \lim_{x \to 0^+} g(x) = 0\text{ }\text{ }$
Condition 4. We have $\text{ }\text{ }\lim_{x \to 0^+} \dfrac{f'(x)}{g'(x)} = \lim_{x \to 0^+} \dfrac{1}{\cos(x)} = L\text{ }\text{ }$ where $L = 1 \in \mathbb{R}$
Therefore, all conditions are met and we conclude that
$$\text{ }\text{ }\lim_{x \to 0^+} \dfrac{x}{\sin(x)} = 1\text{ }\text{ }$$
An identical check on an interval such as $(-\frac{\pi}{4}, 0)$ gives the same left-hand limit, so the two-sided limit is also $1$.
Examples that won't work
Let's look at examples that break the conditions given. We are getting close to answering your question.
For condition 1. Take $f$ to be the triangle wave function, and suppose you want to calculate the limit as $x \to \infty$. The problem here is that you can't choose any $a \in \mathbb{R}$ such that $f$ is differentiable on $(a,\infty)$, because there will always be a peak of the triangle wave inside that interval, and the function is not differentiable at that peak.
For condition 2. Take $g(x) = \sin(\frac{1}{x})$, and suppose you want to calculate the limit as $x \to 0$. The problem here is that you can't choose any $b > 0$ such that $g'$ is never zero on $(0,b)$, because there are infinitely many peaks of the sine in any interval $(0,b)$ (look at a graph of $\sin(\frac{1}{x})$, or see the derivative computation just after this list).
For condition 3. I challenge you to think of this example, shouldn't be too hard.
For condition 4. Steven Gubkin gave this example in his answer: consider $f(x) = x+\sin(x)$ and $g(x) = x$. The limit of $\frac{x + \sin(x)}{x}$ does exist when $x \to \infty$, but $\lim_{x \to \infty} \frac{1+\cos(x)}{1}$ does not exist.
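To make the condition 2 failure above concrete (just a quick sketch of the derivative computation mentioned in that item): $$g(x) = \sin\!\left(\tfrac{1}{x}\right) \implies g'(x) = -\frac{1}{x^2}\cos\!\left(\tfrac{1}{x}\right),$$ which vanishes whenever $\frac{1}{x} = \frac{\pi}{2} + k\pi$, i.e. at the points $x = \frac{2}{(2k+1)\pi}$. These points accumulate at $0$, so no interval $(0, b)$ avoids them.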
TL;DR & Final Words
L'Hospital's Rule is a math theorem, so it is always correct. You just have to check whether all the hypotheses are satisfied. At the beginning of this answer, the four conditions for the theorem to hold are given.
Pay attention to what the theorem actually says: given the needed hypotheses, if the limit of $\dfrac{f'(x)}{g'(x)}$ exists, then the limit of $\dfrac{f(x)}{g(x)}$ also exists and those limits are equal.
There are situations in which the theorem is applicable, and correct (of course), but useless, because the new limit is as hard to calculate as (or even harder than) the first one. Alex Zorn gave an excellent example of this in his answer:
$$\lim_{x \rightarrow 0}\frac{e^{-\frac{1}{x^2}}}{x}$$
L'Hopital's Theorem is applicable in this case, and is correct, but it is not useful, because the new limit is just as hard as the first one and you make no progress.
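To see why no progress is made (a quick sketch, written for $x \to 0^+$, with $f(x) = e^{-\frac{1}{x^2}}$ and $g(x) = x$): one application of L'Hopital's Rule gives $$\lim_{x \to 0^+}\frac{\frac{2}{x^3}\,e^{-\frac{1}{x^2}}}{1} = \lim_{x \to 0^+}\frac{2\,e^{-\frac{1}{x^2}}}{x^3},$$ which is the same kind of limit with an even worse power of $x$ in the denominator. A substitution handles the original directly: with $t = \frac{1}{x}$, the one-sided limit becomes $\lim_{t\to+\infty} t\,e^{-t^2} = 0$ (and the $x \to 0^-$ side works the same way with $t \to -\infty$).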
Just one last thing (fun fact). In your question, you said
Today was my first day learning L'Hopital's Rule, and I was wondering if there are any situations in which you cannot use this rule, with the exception of when a limit is determinable.
An interesting catch is that the exception you mentioned isn't really an exception. If the limit is easy to determine but still satisfies the theorem's hypotheses, the theorem still holds. For example:
$$\lim_{x \to \infty}\frac{1}{x}$$
The functions $f(x) = 1$ and $g(x) = x$ fit all four conditions on any interval $(a, \infty)$ with $a > 0$, so you can apply L'Hospital's Rule and get
$$\lim_{x \to \infty}\frac{1}{x} = \lim_{x \to \infty}\frac{0}{1} = 0$$
By the way, this kind of analysis is done in Real Analysis courses; in case you are interested in learning more, I strongly recommend taking such a course.
Solution 5:
Here is an example of why you should be careful:
$$ \begin{align*} \lim_{x \to \infty} \frac{x+\sin(x)}{x} &= \lim_{x \to \infty} \frac{\frac{d}{dx}(x+\sin(x))}{\frac{d}{dx}(x)}\\ &= \lim_{x \to \infty} \frac{1+\cos(x)}{1} \end{align*} $$
This last limit does not exist, so you might be tempted to conclude that the original limit does not exist either.
In fact,
$$ \begin{align*} \lim_{x \to \infty} \frac{x+\sin(x)}{x} &= \lim_{x \to \infty} \left(1+ \frac{\sin(x)}{x}\right)\\ &= 1 \end{align*} $$
since you can use the squeeze theorem to prove $\lim_{x \to \infty} \frac{\sin(x)}{x} = 0$.
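(In case it helps, the squeeze here is just that, for all $x > 0$, $$-\frac{1}{x} \le \frac{\sin(x)}{x} \le \frac{1}{x},$$ and both bounds tend to $0$ as $x \to \infty$.)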
What happened?
One of the hypotheses of L'Hopital's rule is that $\lim \frac{f'}{g'}$ exists.