When $f$ is a continuous function on the interval $[a,b]$, we can find a function $F$ defined on $[a,b]$ such that $F'(x)=f(x)$ for all $x\in[a,b]$. This is the “fundamental theorem of calculus”: just take $$ F(x)=\int_{a}^{x} f(t)\,dt. $$ The other functions with the same property are precisely those of the form $F(x)+c$, where $c$ is an arbitrary constant.
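To make this concrete, here is a minimal numerical sketch (assuming Python with SciPy available; the particular $f$ is just an illustration): it builds $F(x)=\int_a^x f(t)\,dt$ by quadrature and checks with a finite difference that differentiating $F$ recovers $f$.

```python
import math
from scipy.integrate import quad

def f(t: float) -> float:
    return math.cos(t)            # any continuous function works as an example

a = 0.0

def F(x: float) -> float:
    value, _err = quad(f, a, x)   # F(x) = integral of f from a to x, computed numerically
    return value

x, h = 1.3, 1e-5
central_difference = (F(x + h) - F(x - h)) / (2 * h)
print(central_difference, f(x))   # the two values should agree to several digits
```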

Sometimes this antiderivative can be expressed with the so-called “elementary functions”, that is, polynomials, rational functions, the exponential, the logarithm, the trigonometric functions, and any algebraic combination thereof. Some (actually many) functions do not admit an antiderivative expressible in this form; this is the case for $e^{-x^2}$, and it can be proved (via Liouville's theorem on elementary antiderivatives), although not easily.

Think of a simpler example: if all we had available as “elementary functions” were polynomials or, more generally, rational functions, the function $1/x$ would not admit an “elementary antiderivative”, but it would still have one: $$ \int_{1}^{x}\frac{1}{t}\,dt. $$ Since this is a “new” function, we give it a name, namely “$\log$”, and we have extended the tool set. The same happens with “$\operatorname{erf}$”, defined by $$ \operatorname{erf}(x)=\frac{2}{\sqrt{\pi}}\int_{0}^{x} e^{-t^2}\,dt, $$ which has many uses in probability theory and statistics because of its relation to normal distributions.
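For the probabilistic connection just mentioned, the standard normal CDF can be written as $\Phi(x)=\tfrac{1}{2}\bigl(1+\operatorname{erf}(x/\sqrt{2})\bigr)$. A small sketch using Python's standard-library `math.erf`, purely as an illustration:

```python
import math

def normal_cdf(x: float) -> float:
    """Standard normal CDF written in terms of erf."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

for x in (-2.0, -1.0, 0.0, 1.0, 2.0):
    print(f"Phi({x:+.1f}) = {normal_cdf(x):.6f}")
# Phi(0.0) is exactly 0.5, and Phi(+1.0) comes out to about 0.841345.
```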


This concept isn't really unique to $\int e^{-x^2}dx$.

If your "domain of knowledge" were just the rational numbers, and you were asked "what number, when squared, gives $2$?", and after not knowing the answer you were told it was $\sqrt{2}$, "the number that, when squared, yields $2$", that explanation would seem circular.

$\operatorname{erf}$ is just some function. The fact that it can't be expressed in terms of "simpler" operations isn't much stranger than the fact that the function $x \mapsto \sqrt{x}$ can't be expressed in terms of "simpler" operations.


The integral you want doesn't have a nice antiderivative in terms of familiar functions. However, it's an important antiderivative, so mathematicians gave it a name: $\operatorname{erf}$.
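To see that this named function really is the antiderivative in question, here is a short numerical check (a sketch assuming SciPy is available; `math.erf` is in the Python standard library): $\int_0^x e^{-t^2}\,dt$ computed by quadrature agrees with $\tfrac{\sqrt{\pi}}{2}\operatorname{erf}(x)$.

```python
import math
from scipy.integrate import quad

for x in (0.5, 1.0, 2.0):
    numeric, _err = quad(lambda t: math.exp(-t * t), 0.0, x)   # integral of exp(-t^2) from 0 to x
    via_erf = math.sqrt(math.pi) / 2.0 * math.erf(x)           # (sqrt(pi)/2) * erf(x)
    print(f"x = {x}: quad = {numeric:.10f}   (sqrt(pi)/2)*erf(x) = {via_erf:.10f}")
```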