What's the probability that a sequence of coin flips never has twice as many heads as tails?
The game stops with probability $u=\frac34(3-\sqrt5)\approx0.572949017$.
See the end of the post for generalizations of this result, first to every asymmetric heads-or-tails game (Edit 1), and then to every integer ratio (Edit 2).
To prove this, consider the random walk which goes two steps to the right each time a tail occurs and one step to the left each time a head occurs. Then the number of heads is exactly twice the number of tails each time the walk is back at its starting point (and only then). In other words, the probability that the game never stops is $1-u$ where $u=P_0(\text{hits}\ 0)$ for the random walk with equiprobable steps $+2$ and $-1$.
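(As a numerical sanity check of this reformulation, here is a minimal Monte Carlo sketch in Python, not part of the proof; the function name and the step cap are arbitrary choices, and the cap introduces only a negligible truncation bias since the walk drifts to $+\infty$.)

```python
import random

def hits_zero_again(max_steps=1000):
    """One walk with equiprobable steps +2 (tail) and -1 (head), started at 0;
    report whether it returns to 0 within max_steps steps."""
    position = 0
    for _ in range(max_steps):
        position += 2 if random.random() < 0.5 else -1
        if position == 0:
            return True
    return False

trials = 20_000
estimate = sum(hits_zero_again() for _ in range(trials)) / trials
print(estimate)              # ~0.57, up to Monte Carlo noise of a few thousandths
print(0.75 * (3 - 5**0.5))   # 0.5729490168751576
```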
The classical one-step analysis of hitting times for Markov chains (conditioning on the first step, which is $+2$ or $-1$ with probability $\frac12$ each) yields $2u=v_1+w_2$ where, for every positive $k$, $v_k=P_{-k}(\text{hits}\ 0)$ and $w_k=P_{k}(\text{hits}\ 0)$. We first evaluate $w_2$, then $v_1$.
The $(w_k)$ part is easy: the only steps to the left are $-1$ steps, hence to hit $0$ starting from $k\ge1$, the walk must first hit $k-1$ starting from $k$, then hit $k-2$ starting from $k-1$, and so on. These $k$ events are equiprobable, hence $w_k=(w_1)^k$. Another one-step analysis, this time for the walk starting from $1$, yields $$ 2w_1=1+w_3=1+(w_1)^3 $$ hence $w_k=w^k$ where $w$ solves $w^3-2w+1=0$. Since the walk drifts to the right, $w<1$, hence factoring out the root $w=1$ leaves $w^2+w=1$; keeping the nonnegative root yields $w=\frac12(\sqrt5-1)$.
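(A quick numerical aside, not needed for the proof: hitting probabilities form the minimal nonnegative solution of their one-step equations, so iterating $w\mapsto\frac12(1+w^3)$ from $0$ converges to the root below $1$ rather than to the spurious root $w=1$.)

```python
# Iterate the one-step equation 2w = 1 + w^3 starting from w = 0; the iterates
# increase to the minimal nonnegative fixed point, i.e. the hitting
# probability w_1, not the extraneous root w = 1.
w = 0.0
for _ in range(200):
    w = (1 + w**3) / 2
print(w)                 # ~0.6180339887
print((5**0.5 - 1) / 2)  # (sqrt(5) - 1) / 2
```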
Let us consider the $(v_k)$ part. The random walk has a drift to the right hence its position converges to $+\infty$ almost surely. Let $k+R$ denote the first position visited on the right of the starting point $k$. Then $R\in\{1,2\}$ almost surely, the distribution of $R$ does not depend on $k$ because the dynamics is invariant under translations, and $$ v_1=r+(1-r)w_1\quad\text{where}\ r=P_{-1}(R=1). $$ Now, starting from $0$, $R=1$ implies that the first step is $-1$, hence $2r=P_{-1}(A)$ with $A=[\text{hits}\ 1\ \text{before}\ 2]$. Consider $R'$ for the random walk starting at $-1$. If $R'=2$, $A$ occurs. If $R'=1$, the walk is back at position $0$, hence $A$ occurs with probability $r$. In other words, $2r=(1-r)+r^2$, that is, $r^2-3r+1=0$. Since $r<1$, $r=\frac12(3-\sqrt5)$ (hence $r=1-w$).
Plugging these values of $w$ and $r$ into $v_1$ and $w_2$ yields $v_1=r+(1-r)w=(1-w)+w^2=2(1-w)$ and $w_2=w^2=1-w$, hence $2u=3(1-w)$, that is, $u=\frac34(3-\sqrt5)$.
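(The bookkeeping can be cross-checked numerically in a few lines; this is a sketch with arbitrary variable names, not part of the argument.)

```python
from math import isclose, sqrt

w = (sqrt(5) - 1) / 2   # root of w^3 - 2w + 1 = 0 with 0 < w < 1
r = (3 - sqrt(5)) / 2   # root of r^2 - 3r + 1 = 0 with r < 1
assert isclose(r, 1 - w)

v1 = r + (1 - r) * w    # P_{-1}(hits 0)
w2 = w ** 2             # P_{2}(hits 0)
u = (v1 + w2) / 2       # from 2u = v1 + w2

print(u, 0.75 * (3 - sqrt(5)))   # both ~0.5729490168751576
```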
Edit 1 Every asymmetric random walk with elementary steps $+2$ with probability $p$ and $-1$ with probability $1-p$ is transient to $+\infty$ as soon as $p>\frac13$ (and naturally, for every $p\le\frac13$ the walk hits $0$ with full probability). In this regime, one can compute the probability $u(p)$ that the walk started at $0$ hits $0$ again. The result is the following.
For every $p$ in $(\frac13,1)$, $u(p)=\frac32\left(2-p-\sqrt{p(4-3p)}\right).$
Note that $u(p)\to1$ when $p\to\frac13$ and $u(p)\to0$ when $p\to1$, as was to be expected.
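(A small sketch, offered only as a numerical check of the formula at $p=\frac12$, where it should reproduce $\frac34(3-\sqrt5)$, and near the two endpoints.)

```python
from math import sqrt

def u(p):
    """Return-to-0 probability for steps +2 (prob p) and -1 (prob 1-p), 1/3 < p < 1."""
    return 1.5 * (2 - p - sqrt(p * (4 - 3 * p)))

print(u(0.5), 0.75 * (3 - sqrt(5)))  # both ~0.5729490168751576
print(u(1 / 3 + 1e-9))               # ~1, consistent with u(p) -> 1 as p -> 1/3
print(u(1 - 1e-9))                   # ~0, consistent with u(p) -> 0 as p -> 1
```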
Edit 2 Coming back to symmetric heads-or-tails games, note that, for any fixed integer $N\ge2$, the same techniques apply to compute the probability $u_N$ of reaching $N$ times as many tails as heads (by symmetry, the same as $N$ times as many heads as tails).
One gets $2u_N=E(w^{R_N-1})+w^N$ where $w$ is the unique solution in $(0,1)$ of the polynomial equation $2w=1+w^{1+N}$ (the analogue of $w_1$ for the walk with steps $+N$ and $-1$), and the random variable $R_N$ (the analogue of $R$, that is, the overshoot over the starting point) is almost surely in $\{1,2,\ldots,N\}$. The distribution of $R_N$ is characterized by its generating function, which solves $$ (1-(2-r_N)s)E(s^{R_N})=r_Ns-s^{N+1}\quad\text{with}\quad r_N=P(R_N=1). $$ This is equivalent to a system of $N$ equations whose unknowns are the probabilities $P(R_N=k)$ for $k$ in $\{1,2,\ldots,N\}$. One can deduce from this system that $r_N$ is the unique root $r<1$ of the polynomial equation $(2-r)^Nr=1$. One can then note that $r_N=w^N$ and that $E(w^{R_N})=\dfrac{Nr_N}{2-r_N}$, hence some further simplifications finally yield the following general result.
For every $N\ge2$, $u_N=\frac12(N+1)r_N$ where $r_N<1$ solves the equation $(2-r)^Nr=1$.
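(Here is a numerical sketch, with helper names chosen freely, that finds the root $r_N<1$ by bisection, checks the relation $r_N=w^N$ against the root of $2w=1+w^{1+N}$, and recovers $u_2=\frac34(3-\sqrt5)$.)

```python
def root_below_one(f, lo=0.0, hi=1.0 - 1e-6, iters=100):
    """Bisection for the root of f in (lo, hi), assuming f(lo) < 0 < f(hi)."""
    for _ in range(iters):
        mid = (lo + hi) / 2
        if f(mid) < 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

for N in (2, 3, 4):
    r_N = root_below_one(lambda r: (2 - r) ** N * r - 1)   # (2 - r)^N r = 1, r < 1
    w = root_below_one(lambda w: 2 * w - 1 - w ** (1 + N))  # 2w = 1 + w^(1+N), w < 1
    u_N = (N + 1) * r_N / 2
    print(N, u_N, abs(r_N - w ** N))   # the last column should be ~0

print(0.75 * (3 - 5 ** 0.5))           # u_2 should match this value
```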
(Update: The answer to the original question is that the probability of stopping is $\frac{3}{4} \left(3 - \sqrt{5}\right)$. See end of post for an infinite series expression in the general case.)
Let $S(n)$ denote the number of ways to stop after seeing $n$ tails. Seeing $n$ tails means seeing $2n$ heads, so this would be stopping after $3n$ flips. Since there are $2^{3n}$ possible sequences in $3n$ flips, the probability of stopping is $\sum_{n=1}^{\infty} S(n)/8^n$.
To determine $S(n)$, we see that there are $\binom{3n}{n}$ ways to choose which $n$ of $3n$ flips will be tails. However, this overcounts for $n > 1$, as we could have seen twice as many heads as tails for some $k < n$. Of these $\binom{3n}{n}$ sequences, there are $S(k) \binom{3n-3k}{n-k}$ sequences of $3n$ flips in which there are $k$ tails the first time we would see twice as many heads as tails, as any of the $S(k)$ sequences of $3k$ flips could be completed by choosing $n-k$ of the remaining $3n-3k$ flips to be tails. Thus $S(n)$ satisfies the recurrence $S(n) = \binom{3n}{n} - \sum_{k=1}^{n-1} \binom{3n-3k}{n-k}S(k)$, with $S(1) = 3$.
The solution to this recurrence is $S(n) = \frac{2}{3n-1} \binom{3n}{n}.$ This can be verified easily, as substituting this expression into the recurrence yields a slight variation on Identity 5.62 in Concrete Mathematics (p. 202, 2nd ed.), namely, $$\sum_k \binom{tk+r}{k} \binom{tn-tk+s}{n-k} \frac{r}{tk+r} = \binom{tn+r+s}{n},$$ with $t = 3$, $r = -1$, $s=0$.
So the probability of stopping is $$\sum_{n=1}^{\infty} \binom{3n}{n} \frac{2}{3n-1} \frac{1}{8^n}.$$
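(As a supplementary check, a short Python sketch verifying that the closed form for $S(n)$ matches the recurrence and that the partial sums of this series approach $\frac34(3-\sqrt5)$; the cutoff is an arbitrary choice.)

```python
from math import comb

n_max = 60
S = {1: 3}
for n in range(2, n_max + 1):
    S[n] = comb(3 * n, n) - sum(comb(3 * (n - k), n - k) * S[k] for k in range(1, n))

# The closed form 2/(3n-1) * C(3n, n) agrees with the recurrence.
assert all(S[n] == 2 * comb(3 * n, n) // (3 * n - 1) for n in S)

partial = sum(S[n] / 8 ** n for n in S)
print(partial)                 # ~0.572949, approaching the limit from below
print(0.75 * (3 - 5 ** 0.5))   # 0.5729490168751576
```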
Mathematica gives the closed form for this probability of stopping to be $$2 \left(1 - \cos\left(\frac{2}{3} \arcsin \frac{3 \sqrt{3/2}}{4}\right) \right) \approx 0.572949.$$
Added: The sum is hypergeometric and has a simpler representation. See Sasha's comments for why the sum yields this closed form solution and also why the answer is $$\frac{3}{4} \left(3 - \sqrt{5}\right) \approx 0.572949.$$
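(For completeness, a two-line numerical comparison of the two closed forms, included only as a sanity check.)

```python
from math import asin, cos, sqrt

trig = 2 * (1 - cos(2 / 3 * asin(3 * sqrt(3 / 2) / 4)))
surd = 0.75 * (3 - sqrt(5))
print(trig, surd)   # both ~0.5729490168751576, up to floating-point rounding
```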
Added 2: This answer is generalizable to other ratios $r$ up to the infinite series expression. For the general $r \geq 2$ case, the argument above is easily adapted to produce the recurrence $S(n) = \binom{(r+1)n}{n} - \sum_{k=1}^{n-1} \binom{(r+1)n-(r+1)k}{n-k}S(k)$, with $S(1) = r+1$. The solution to the recurrence is $S(n) = \frac{r}{(r+1) n - 1} \binom{(r+1) n}{n}$ and can be verified easily by using the binomial convolution formula given above. Thus, for the ratio $r$, the probability of stopping has the infinite series expression $$\sum_{n=1}^{\infty} \binom{(r+1)n}{n} \frac{r}{(r+1)n-1} \frac{1}{2^{(r+1)n}}.$$ This can be expressed as a hypergeometric function, but I am not sure how to simplify it any further for general $r$ (nor can Mathematica). It can also be expressed using the generalized binomial series discussed in Concrete Mathematics (p. 200, 2nd ed.), but I don't see how to simplify it further in that direction, either.
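(A sketch evaluating the general-$r$ series by partial sums; the cutoff is an arbitrary choice and the printed values are only numerical approximations.)

```python
from math import comb

def stopping_probability(r, n_max=200):
    """Partial sum of sum_{n>=1} C((r+1)n, n) * r / (((r+1)n - 1) * 2^((r+1)n))."""
    return sum(comb((r + 1) * n, n) * r / (((r + 1) * n - 1) * 2 ** ((r + 1) * n))
               for n in range(1, n_max + 1))

for r in (2, 3, 4):
    print(r, stopping_probability(r))
# r = 2 reproduces ~0.572949, i.e. 3/4 * (3 - sqrt(5)).
```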
Added 3: In case anyone is interested, I found a combinatorial proof of the formula for $S(n)$. It works in the general $r$ case, too.