Difference between continuity and uniform continuity
Solution 1:
First of all, continuity is defined at a point $c$, whereas uniform continuity is defined on a set $A$. That makes a big difference. But your interpretation is essentially correct: the point $c$ is part of the data, and it is kept fixed, just like $f$ itself. Roughly speaking, uniform continuity requires the existence of a single $\delta>0$ that works for the whole set $A$, not just near the single point $c$.
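To make the comparison concrete, here are the two definitions written out side by side (I take $f$ to be a real-valued function defined on $A\subseteq\mathbb{R}$, which the answer leaves implicit):
$$
\begin{aligned}
\text{$f$ is continuous at $c\in A$:}&\quad \forall\varepsilon>0\ \ \exists\delta>0\ \ \forall x\in A:\ |x-c|<\delta \implies |f(x)-f(c)|<\varepsilon,\\
\text{$f$ is uniformly continuous on $A$:}&\quad \forall\varepsilon>0\ \ \exists\delta>0\ \ \forall x,y\in A:\ |x-y|<\delta \implies |f(x)-f(y)|<\varepsilon.
\end{aligned}
$$
In the first statement $\delta$ is allowed to depend on both $\varepsilon$ and $c$; in the second it may depend only on $\varepsilon$.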
Solution 2:
The difference is in the ordering of the quantifiers.
- Continuity:
For all $x$, for all $\varepsilon$, there exists a $\delta$ such that something something.
- Uniform continuity:
For all $\varepsilon$, there exists a $\delta$ such that for all $x$, something something.
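Abbreviating the "something something" by $P$, where $P$ stands for the estimate used below ("$|f(x)-f(y)|<\varepsilon$ whenever $|x-y|<\delta$", for all relevant $y$), the two statements differ only in where $\forall x$ sits relative to $\exists\delta$:
$$
\text{continuity:}\ \ \forall x\ \ \forall\varepsilon>0\ \ \exists\delta>0:\ P,
\qquad\qquad
\text{uniform continuity:}\ \ \forall\varepsilon>0\ \ \exists\delta>0\ \ \forall x:\ P.
$$
Because $\exists\delta$ comes after $\forall x$ in the first statement, $\delta$ may depend on $x$; in the second it is chosen before $x$ is, so one $\delta$ must work for all of them.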
For something to be continuous, you can check it "one $x$ at a time": for each $x$, you pick an $\varepsilon$ and then find some $\delta$, depending on both $x$ and $\varepsilon$, so that $|f(x)-f(y)|<\varepsilon$ whenever $|x-y|<\delta$. As you can see if you try it on $f(x)=1/x$ on $(0,1)$, you can find such a $\delta$ for every $x$ and $\varepsilon$. However, if you fix $\varepsilon$, the values of $\delta$ that you need become arbitrarily small as $x$ approaches $0$.
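For concreteness, here is one admissible choice of $\delta$ (one possibility among many; the specific formula is mine, not part of the original answer): given $x\in(0,1)$ and $\varepsilon>0$, take $\delta=\min\{x/2,\ \varepsilon x^2/2\}$. If $|x-y|<\delta$ with $y\in(0,1)$, then $y>x-x/2=x/2$, so
$$
|f(x)-f(y)|=\left|\frac1x-\frac1y\right|=\frac{|x-y|}{xy}<\frac{\delta}{x\cdot x/2}=\frac{2\delta}{x^2}\le\varepsilon.
$$
This $\delta$ does the job, but it shrinks like $x^2$ for small $x$, which is exactly the "arbitrarily small as $x$ approaches $0$" behaviour described above.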
If you want uniform continuity, you need to pick an $\varepsilon$ first and then find a single $\delta$ which is good for ALL the $x$ values you might have. As you can see, for $f(x)=1/x$ on $(0,1)$, such a $\delta$ does not exist.
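To fill in that last step with a standard computation: fix $\varepsilon=1$ and suppose some $\delta>0$ worked for every $x\in(0,1)$; shrinking it if necessary, we may assume $\delta<1$. Take $x=\delta$ and $y=\delta/2$; then $|x-y|=\delta/2<\delta$, yet
$$
\left|\frac1x-\frac1y\right|=\left|\frac1\delta-\frac2\delta\right|=\frac1\delta>1=\varepsilon.
$$
So no single $\delta$ serves all $x$ at once, and $f(x)=1/x$ is not uniformly continuous on $(0,1)$.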