Intuition on Harris recurrence
Solution 1:
This answer may be wrong, but I think it is worth posting; if it is wrong, someone can point that out and I can learn something too.
I think you do not mean a finite Markov chain, because for a finite-state chain, assuming it is irreducible, every state is visited infinitely often with probability $1$, so there is no question.
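To spell that out in symbols (this restatement is my own): for a finite irreducible chain,
$$\mathbb{P}_x\bigl(X_n = i \text{ for infinitely many } n\bigr) = 1 \quad \text{for every pair of states } x, i,$$
so recurrence and Harris-type recurrence ask for the same thing there.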
I think recurrence and Harris recurrence only differ when the state space is uncountable.
This is because, writing $V_i$ for the number of visits to state $i$, the event $\{V_i=\infty\}$ is the same as the event "state $i$ is visited infinitely often", and this event has probability $1$ or $0$ by Lévy's zero-one law.
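To make the zero-one-law step explicit (the function $h$ and the shift-invariance argument are my additions, not something stated in the question): write $A=\{V_i=\infty\}$ and $h(x)=\mathbb{P}_x(A)$. Since $A$ is invariant under the shift, the Markov property and Lévy's zero-one law give
$$\mathbb{P}(A \mid \mathcal{F}_n) = h(X_n) \longrightarrow \mathbf{1}_A \quad \text{almost surely},$$
so along almost every path $h(X_n)$ converges to $0$ or $1$. In particular, when $h$ is constant, as it is for an irreducible countable chain (where $h\equiv 1$ if $i$ is recurrent and $h\equiv 0$ if $i$ is transient), $\mathbb{P}(A)$ itself must be $0$ or $1$.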
So, suppose a positive recurrent chain is not Harris recurrent. This means the expected number of visits to $i$ is infinite while the number of visits to $i$ is finite almost surely, but doesn't that mean it is transient?
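For reference, the general-state-space definitions I have in mind (following Meyn and Tweedie; the occupation time $\eta_A = \sum_{n\ge 1}\mathbf{1}\{X_n\in A\}$ and the irreducibility measure $\psi$ are notation I am adding) compare, for every set $A$ with $\psi(A)>0$ and every starting point $x$,
$$\text{recurrent:}\quad \mathbb{E}_x[\eta_A]=\infty, \qquad\qquad \text{Harris recurrent:}\quad \mathbb{P}_x(\eta_A=\infty)=1.$$
So the gap is exactly between an infinite expectation and an almost surely infinite number of visits. If I recall the theory correctly, a recurrent chain becomes Harris recurrent after a suitable $\psi$-null set of starting points is removed, so the two notions can only disagree from exceptional initial states, rather than the chain being transient.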