The difference between convergence in $L^{\infty}$ and almost uniform convergence
Solution 1:
Yes, that is right.
It is a good exercise to prove that your definition of $L^\infty$ convergence is equivalent to:
There is a set $E \in \mathcal{M}$ with $\mu(E) = 0$ such that $f_n \to f$ uniformly on $E^c$.
The thing to notice is that your definition should be expanded as:
For all $\epsilon > 0$ there exists $N$ such that for all $n \ge N$ there exists $F \in \mathcal{M}$ such that $\mu(F) = 0$ and $|f_n(x) - f(x)| \le \epsilon$ for all $x \in F^c$.
That is, the set $F$ may depend on both $\epsilon$ and $n$. To show the equivalence with my statement, you have to produce a single set $F$ that works for every $\epsilon$ and every $n$ simultaneously.
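Here is a minimal sketch of the harder direction, using the expanded definition above; the sets $F_{k,n}$ and indices $N_k$ are notation introduced only for this illustration. For each $k \ge 1$, apply the definition with $\epsilon = 1/k$ to obtain $N_k$ and, for every $n \ge N_k$, a null set $F_{k,n}$ such that $|f_n(x) - f(x)| \le 1/k$ for all $x \in F_{k,n}^c$. Now take
$$
F = \bigcup_{k \ge 1} \bigcup_{n \ge N_k} F_{k,n},
\qquad
\mu(F) \le \sum_{k \ge 1} \sum_{n \ge N_k} \mu(F_{k,n}) = 0,
$$
a countable union of null sets, hence null. On $F^c$ we have $|f_n(x) - f(x)| \le 1/k$ for every $n \ge N_k$ and every $x \in F^c$, which is exactly uniform convergence on $F^c$. The converse direction is immediate: a single null set $E$ certainly works for every $\epsilon$ and $n$.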
Solution 2:
Here's an example to see the difference:
Consider the sequence of functions $f_n$, where $f_n(x) = 1$ whenever $-\frac{1}{n} < x < \frac{1}{n}$ and $f_n(x) = 0$ otherwise. These functions are clearly measurable. Also, the pointwise limit is the function $f$ which is zero on $\mathbb{R} \setminus \{0\}$ and $f(0) = 1$. Finally, one can show that this converges almost uniformly (by chopping off small intervals centered at $0$), but not in $L^\infty$ (since no such chopping by null sets is possible).
Since I am in a hurry, I could only give a sketch. Perhaps I could try and clarify any doubts you have in the comments?
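For what it's worth, here is one way the sketch might be filled in, assuming $\mu$ is Lebesgue measure on $\mathbb{R}$ and using the standard definition of almost uniform convergence (uniform convergence off a set of arbitrarily small measure); the set $E_\delta$ is notation used only for this illustration. Given $\delta > 0$, let $E_\delta = (-\delta/4, \delta/4)$, so $\mu(E_\delta) = \delta/2 < \delta$. For every $x \in E_\delta^c$ and every $n \ge 4/\delta$ we have $1/n \le \delta/4 \le |x|$, hence $f_n(x) = 0 = f(x)$; thus
$$
\sup_{x \in E_\delta^c} |f_n(x) - f(x)| = 0 \quad \text{for all } n \ge 4/\delta,
$$
so $f_n \to f$ uniformly on $E_\delta^c$, and the convergence is almost uniform. On the other hand, for any null set $F$ the set $(0, 1/n) \setminus F$ has measure $1/n > 0$, hence contains some point $x$, and there $|f_n(x) - f(x)| = 1$. Therefore $\operatorname{ess\,sup} |f_n - f| = 1$ for every $n$, and $f_n \not\to f$ in $L^\infty$.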