Is there any way to systematically do all epsilon-delta proofs?
If you want to prove that the limit of $f(x)$ as $x$ goes to $a$ is equal to $L$ using the epsilon-delta definition of the limit, you need to solve the inequality
$$|f(x)-L|<\epsilon$$
for $x$, getting it into the form
$$|x-a|<\delta$$
for some $\delta$, which will in general be a function of $\epsilon$.
My question is, is there some way to calculate the function $\delta(\epsilon)$, short of solving the inequality above using the function $f$ you have?
Is it at least possible if $f$ is sufficiently well behaved? Like if $f$ is differentiable, can you calculate $\delta(\epsilon)$ using the derivative of $f$?
EDIT: This journal paper shows a formula for polynomials. If $f(x) = \sum_{n=0}^{k} a_n (x-a)^n$, then to prove that the limit of $f(x)$ as $x$ goes to $a$ equals $f(a)$, we can let $\delta = \min\left(1,\frac{\epsilon}{\sum_{n=1}^{k} |a_n|}\right)$.
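For concreteness, here's a quick numerical sanity check of that formula (the particular polynomial is just a made-up example, not from the paper): take $f(x)=x^2+3x+1$, which expands around $a=1$ as $5+5(x-1)+(x-1)^2$, so $a_1=5$, $a_2=1$, and $f(a)=5$.

```python
import random

# Made-up example: f(x) = x^2 + 3x + 1 expanded around a = 1,
# i.e. f(x) = 5 + 5(x-1) + (x-1)^2, so a_1 = 5, a_2 = 1 and f(a) = 5.
f = lambda x: x**2 + 3*x + 1
a, L = 1.0, 5.0
sum_abs = 5.0 + 1.0                      # sum_{n=1}^{k} |a_n|

for eps in (1.0, 0.1, 0.001):
    delta = min(1.0, eps / sum_abs)      # the formula from the paper
    for _ in range(10_000):
        x = a + random.uniform(-delta, delta)   # sample points with |x - a| < delta
        assert abs(f(x) - L) < eps
print("the formula held on every sampled point")
```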
Can this be generalized to Taylor series? If $f(x) = \sum_{n=0}^{\infty} a_n (x-a)^n$, then can we prove that the limit of $f(x)$ as $x$ goes to $a$ equals $f(a)$ by letting $\delta = \min\left(1,\frac{\epsilon}{\sum_{n=1}^{\infty} |a_n|}\right)$?
Below I deal with the power series question. I'll use your notation and assume WLOG that $a=0.$
Here's a simple solution to the general $\delta = \varphi(\epsilon)$ question that uses a different idea. Suppose the radius of convergence of the series is $r\in (0,\infty).$ Then
$$f'(x) = \sum_{n=1}^{\infty}na_nx^{n-1},\,\,|x|<r.$$
Define $D=\sum_{n=1}^{\infty}n|a_n|(r/2)^{n-1},$ which is finite because $r/2<r.$ Then for $|x|<r/2,$ the mean value theorem gives
$$|f(x)-f(0)| = |f'(c_x)||x| \le D|x|,$$
where $c_x$ lies between $0$ and $x,$ so that $|c_x|<r/2$ and hence $|f'(c_x)|\le D.$
Thus $\delta = \min(r/2,\epsilon/D)$ is a solution.
Note that since $r = 1/\limsup |a_n|^{1/n},$ we really do have a formula for $\delta $ as a function of $\epsilon$ that depends only on the coefficients $a_1,a_2, \dots.$ Note also that in the case $r=\infty,$ we can replace $r/2$ by $1$ in the above, and everything goes through.
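Here is a quick numerical sanity check of this recipe (my own toy example, not part of the argument): take $f(x)=\sum_{n\ge 0}x^n = 1/(1-x)$, so $a_n=1$, $r=1$, and $D=\sum_{n\ge1}n(1/2)^{n-1}=4$.

```python
import random

# Toy example: f(x) = sum_{n>=0} x^n = 1/(1-x), so a_n = 1 and r = 1.
f = lambda x: 1.0 / (1.0 - x)
r = 1.0
# D = sum_{n>=1} n*(r/2)^(n-1); for this series it equals 1/(1 - 1/2)^2 = 4.
D = sum(n * (r / 2) ** (n - 1) for n in range(1, 200))

for eps in (2.0, 0.5, 1e-3):
    delta = min(r / 2, eps / D)
    for _ in range(10_000):
        x = random.uniform(-delta, delta)
        assert abs(f(x) - f(0.0)) < eps
print("delta = min(r/2, eps/D) worked on every sample")
```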
Now to your specific question: Does $\delta = \min(1,\epsilon/(\sum_{n=1}^{\infty}|a_n|))$ work? The answer is yes, assuming $\sum|a_n| < \infty.$
Proof: Because $\sum|a_n| < \infty,$ the power series defining $f$ has radius of convergence at least $1.$ Let $\epsilon>0.$ Set $\delta = \min(1,\epsilon/(\sum_{n=1}^{\infty}|a_n|)).$ If $|x|<\delta,$ then
$$|f(x)-f(0)| = \left|\sum_{n=1}^{\infty}a_nx^n\right| \le \sum_{n=1}^{\infty}|a_n||x|^n = |x|\sum_{n=1}^{\infty}|a_n||x|^{n-1} \le |x|\sum_{n=1}^{\infty}|a_n| < \epsilon.$$
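As a quick numerical sanity check of this $\delta$ (a made-up example of mine, not part of the proof): take $f(x)=e^x=\sum_{n\ge0}x^n/n!$, for which $\sum_{n\ge1}|a_n|=e-1$.

```python
import math, random

# Made-up test case: f(x) = e^x, so a_n = 1/n! and sum_{n>=1} |a_n| = e - 1.
S = math.e - 1.0

for eps in (1.0, 0.1, 1e-4):
    delta = min(1.0, eps / S)
    for _ in range(10_000):
        x = random.uniform(-delta, delta)
        assert abs(math.exp(x) - 1.0) < eps   # |f(x) - f(0)| < eps
print("delta = min(1, eps/(e-1)) worked on every sample")
```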
This result covers all cases where the radius of convergence is greater than $1.$ But obviously the result fails if $\sum|a_n| = \infty.$ Here we are in the case where the radius of convergence $r$ is a number in $(0,1].$ This can be handled by scaling into the $\sum|a_n| < \infty$ situation, and then scaling back. But the answer isn't as simple in this case. Since Micah's answer already covers this argument, I'll omit it here. (Note that the first method I mentioned, involving $f'(x),$ does not require this scaling argument.)
Suppose $f$ can be written as a power series around $0$: that is, $f(x)=\sum_{n=0}^\infty a_n x^n$ for some sequence $\{a_n\}$. We'll examine the continuity of $f$ at zero. (Of course, you could shift the power series to some other point and this analysis would apply there as well.)
We'll also start by assuming that $f$ has a radius of convergence which is strictly greater than $1$: this implies that $\sum |a_n|$ is convergent. Later on we'll remove this assumption. Let $P_n$ be the $n$th partial sum of the series. Fix $\epsilon>0$ and let $\delta_n=\min\left(1, \frac{\epsilon/2}{\sum_{i=1}^n |a_i|}\right)$. Then, by the linked paper, if $|x|<\delta_n$, then $|P_n(x)-a_0|<\epsilon/2$.
Now, take $\delta=\min\left(1, \frac{\epsilon/2}{\sum_{i=1}^\infty |a_i|}\right)$. Then $\delta \leq \delta_n$ for all $n$. So, if $|x|<\delta$, then $|P_n(x)-a_0|<\epsilon/2$ for all $n$: that is, $P_n(x)$ lies in the open $(\epsilon/2)$-ball around $a_0$ for all $n$. Since $\lim_{n \to \infty} P_n(x)=f(x)$, it follows that $f(x)$ lies in the closure of that ball. That is, we have $|f(x)-a_0|\leq \epsilon/2 < \epsilon$ whenever $|x|<\delta$. So, for any $\epsilon>0$, we can do our $\epsilon$-$\delta$ proof with $\delta=\min\left(1,\frac{\epsilon/2}{\sum_{i=1}^\infty |a_i|}\right)$.
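Here's a small numerical illustration of the "every partial sum stays in the $\epsilon/2$-ball" step (my own toy example, with $f(x)=e^x$, so $a_0=1$ and $\sum_{i\ge1}|a_i|=e-1$):

```python
import math, random

# Toy example: f(x) = e^x, a_n = 1/n!, so a_0 = 1 and sum_{n>=1} |a_n| = e - 1.
S = math.e - 1.0
eps = 0.1
delta = min(1.0, (eps / 2) / S)

x = random.uniform(-delta, delta)
partial, term = 1.0, 1.0                   # P_0(x) = a_0
for n in range(1, 50):
    term *= x / n                          # term is now x^n / n!
    partial += term                        # partial is now P_n(x)
    assert abs(partial - 1.0) < eps / 2    # each P_n(x) lies in the eps/2-ball
assert abs(math.exp(x) - 1.0) <= eps / 2   # the limit f(x) lies in its closure
print("all partial sums stayed within eps/2 of a_0")
```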
This works when $f$ has a large enough radius of convergence, but what about the general case? In general, to say that $f$ can be written as a power series around $0$ is to say that it has some positive radius of convergence. That is, $R=\frac{1}{\limsup |a_n|^{1/n}}$ is positive. Fix some $r$ with $0<r<R$ (for definiteness, we could take $r=R/2$).
Now, let $$g(x)=f(rx)=\sum_{n=0}^\infty a_n r^n x^n$$ This is a power series with a radius of convergence $R/r$, which is strictly greater than $1$, and so we can apply our previous result to $g$. That is, given any $\epsilon>0$, let $\delta_g=\min\left(1,\frac{\epsilon/2}{\sum_{i=1}^\infty |a_i| r^i}\right)$. Then, if $|x|<\delta_g$, $|g(x)-a_0|<\epsilon$.
Now, let $\delta=r\delta_g$. If $|x|<\delta$, then $|x/r|<\delta_g$, and so $|f(x)-a_0|=|g(x/r)-a_0|<\epsilon$. It follows that, for any $f$ which can be written as a convergent power series in a neighborhood of $0$, we can do our $\epsilon$-$\delta$ proof with $\delta=r\delta_g=r\min\left(1,\frac{\epsilon/2}{\sum_{i=1}^\infty |a_i| r^i}\right)$.
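As a final sanity check (again a made-up example, this time with radius of convergence exactly $1$, so that $\sum|a_n|=\infty$): take $f(x)=\sum_{n\ge0}x^n=1/(1-x)$ and $r=R/2=1/2$, so that $\sum_{i\ge1}|a_i|r^i=1$.

```python
import random

# Made-up example with R = 1 (so sum |a_n| diverges): f(x) = 1/(1-x), a_n = 1.
# With r = R/2 = 1/2, sum_{i>=1} |a_i| r^i = sum_{i>=1} (1/2)^i = 1.
f = lambda x: 1.0 / (1.0 - x)
r, S = 0.5, 1.0

for eps in (1.0, 0.1, 1e-4):
    delta = r * min(1.0, (eps / 2) / S)    # delta = r * delta_g
    for _ in range(10_000):
        x = random.uniform(-delta, delta)
        assert abs(f(x) - 1.0) < eps       # |f(x) - a_0| < eps
print("the rescaled delta worked on every sample")
```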
This answers the question in your edit. In all fairness I should say that I don't think it does a very good job of answering your initial question: being equal to a convergent power series in the neighborhood of a point is a highly restrictive property! (I actually think the deleted answer, which works for any continuously differentiable function, is in many ways superior to this one...)