As you can see by reading the answers to this question involving a knight's tour on a chessboard, we can determine the expected return time for any finite irreducible Markov chain by finding its unique stationary distribution; if the mass of the distribution at a given state is $p$, then the expected return time to that state is $\frac{1}{p}$. (Intuitively, if we run the Markov chain for a long time, we expect to be at the given state with probability $p$, so the average spacing between successive returns to that state is $\frac{1}{p}$.)

In the case of a random walk on a connected graph $G$, the unique stationary distribution assigns each vertex a mass proportional to its degree. When the graph is regular (as is the case here, since the same number of moves is available from every position), every vertex gets the same mass, namely $\frac{1}{|G|}$, where $|G|$ is the number of vertices. So the expected return time to each vertex is exactly $|G|$.
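As a sanity check of this claim, here is a small simulation (my own illustration, not part of the original argument) on the 3-dimensional hypercube graph $Q_3$, which is 3-regular with 8 vertices, so the expected return time should be 8:

```python
import random

# Check: for a random walk on a regular graph, the expected return
# time to any vertex equals the number of vertices.
# Toy example: the hypercube graph Q3 (8 vertices, 3-regular).
# Vertices are 3-bit strings; each step flips one random bit.

def return_time(start=0, bits=3):
    """Number of steps until the walk first returns to `start`."""
    v, steps = start, 0
    while True:
        v ^= 1 << random.randrange(bits)  # flip a random bit
        steps += 1
        if v == start:
            return steps

random.seed(1)
trials = 200_000
avg = sum(return_time() for _ in range(trials)) / trials
print(avg)  # should be close to 2**3 = 8
```

The same experiment run on the Rubik's cube graph is hopeless in practice, of course, which is exactly why the stationary-distribution argument is so pleasant.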

Thus the expected number of turns required to return to the starting position of a Rubik's cube of any size, when the turns are chosen uniformly at random, is equal to the number of positions of the cube.


Consider a simpler scenario: suppose you have a biased coin, which shows Heads with probability $p > 0$. What is the expected number of attempts required to throw a Head? The probability that exactly $n$ throws are required is $q^{n-1}p$, where $q = 1-p$. So the expected number of throws is $\sum_{n=1}^\infty nq^{n-1}p = \dfrac{p}{(1-q)^2} = \dfrac{1}{p}$. Note in particular that this is finite!
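A quick numerical check of $\mathbb{E}[\text{attempts}] = \frac{1}{p}$, with the arbitrary demo value $p = 0.3$:

```python
import random

# Sanity check of E[attempts] = 1/p for a biased coin with P(Heads) = p.
# The value p = 0.3 is an arbitrary choice for the demo.

def attempts_until_heads(p):
    n = 1
    while random.random() >= p:  # tails with probability q = 1 - p
        n += 1
    return n

random.seed(0)
p = 0.3
trials = 100_000
avg = sum(attempts_until_heads(p) for _ in range(trials)) / trials
print(avg)  # close to 1/p = 3.33...
```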

Now, let $N$ be the maximum number of moves required to solve an $n \times n \times n$ Rubik's cube. Thanks to the efforts of countless dedicated investigators, we know that for $n=3$, we have $N=20$ according to the half-turn metric, and $N=26$ according to the quarter-turn metric (see for instance this link). But the exact value of $N$ doesn't matter; the number of positions is finite, and therefore the maximum number of moves required to solve a solvable position is also finite.

Suppose the number of moves available at each turn is $m$. You say that $m=6n$, and I won't argue with that; again, the important thing is that $m$ is finite. Then all you have to do to solve a position in $N$ moves or fewer is to pick an optimal move at each of the $N$ steps, and the probability of doing this by pure chance is at least $p=m^{-N}$.
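Just to see how small this $p$ is, we can plug in the $n=3$ numbers from above, $m = 6n = 18$ and $N = 20$ (half-turn metric):

```python
from fractions import Fraction

# Lower bound on the chance of stumbling onto an optimal solution
# by playing N random moves, for the 3x3x3 cube: p >= m**-N
# with m = 6n = 18 available moves and N = 20 (half-turn metric).

m, N = 18, 20
p = Fraction(1, m**N)
print(m**N)      # 18**20, about 1.27e25
print(float(p))  # tiny, but strictly positive
```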

So if you regard each block of $N$ random moves as a single "coin flip", where Heads means the block happens to solve the cube, then the expected number of coin flips is at most $\dfrac{1}{p} = m^N$, and hence the expected number of moves is at most $\dfrac{N}{p} = Nm^N$. In particular, it is finite.
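The whole argument can be watched in miniature on a toy puzzle of my own devising: a 4-state cyclic "puzzle" with states $0,1,2,3$, two moves (rotate $\pm 1 \bmod 4$), and solved state $0$. Here any position is solvable in $N = 2$ moves with $m = 2$, so the coin-flip argument gives the bound $Nm^N = 2 \cdot 2^2 = 8$, while the stationary-distribution argument from the first part predicts an expected return time equal to the number of positions, namely $4$:

```python
import random

# Toy puzzle: 4 states on a cycle, two moves (+1 / -1 mod 4),
# starting (and solved) state 0. The walk on this graph is 2-regular,
# so the expected return time should equal the number of states, 4,
# comfortably within the crude coin-flip bound N * m**N = 2 * 2**2 = 8.

def moves_until_return(k=4):
    state, moves = 0, 0
    while True:
        state = (state + random.choice((1, -1))) % k
        moves += 1
        if state == 0:
            return moves

random.seed(2)
trials = 200_000
avg = sum(moves_until_return() for _ in range(trials)) / trials
print(avg)  # close to 4, below the bound 8
```

For the real $3 \times 3 \times 3$ cube the gap is far more dramatic: the crude bound is $20 \cdot 18^{20} \approx 2.5 \times 10^{26}$, while the exact expected return time, by the first part of this answer, is the number of positions, about $4.3 \times 10^{19}$.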