Can the entropy of a random variable with countably many outcomes be infinite?
Consider a random variable $X$ taking values over $\mathbb{N}$. Let $\mathbb{P}(X = i) = p_i$ for $i \in \mathbb{N}$. The entropy of $X$ is defined by $$H(X) = \sum_i -p_i \log p_i.$$ Is it possible for $H(X)$ to be infinite?
Consider the independent variables $X_1, X_2, X_3, \ldots$, where $X_k$ has a discrete uniform distribution over $2^k$ values, with pairwise non-overlapping supports: $X_1 \sim U[2\ldotp\ldotp3]$, $X_2\sim U[4\ldotp\ldotp7]$, $X_3 \sim U[8\ldotp\ldotp15]$, etc. The respective entropies are $H(X_k) = k$ bits.
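As a quick sanity check on the claim $H(X_k)=k$ bits, here is a minimal sketch (the helper name `entropy_bits` is mine, not standard):

```python
import math

def entropy_bits(probs):
    """Shannon entropy in bits of a probability vector."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# X_k is uniform over 2^k consecutive integers, e.g. X_2 ~ U[4..7];
# a uniform distribution over 2^k outcomes has entropy exactly k bits.
for k in range(1, 6):
    n = 2 ** k
    print(k, entropy_bits([1.0 / n] * n))
```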
Let $Y = X_M$, where $M$ is a random variable (independent of the $X_k$) with pmf $m_k$ ($\sum_{k=1}^\infty m_k=1$). This is known as a mixture, with mixing distribution $m_k$, and, because the components do not overlap, the resulting entropy is
$$\begin{aligned} H(Y)&=H(M)+m_1 H(X_1)+ m_2 H(X_2) + m_3 H(X_3) +\cdots \\ &= H(M) + m_1 + 2 \, m_2 + 3 \, m_3 +\cdots \\&= H(M) + \sum_{k=1}^\infty \, k \, m_k = H(M) + E(M). \end{aligned}$$
Hence, by choosing any mixing distribution $m_k$ with infinite mean ($E(M)=\sum k \, m_k=+\infty$), we attain infinite entropy. For example, $m_k=A/k^2$ with normalizing constant $A$ (the example in David's answer essentially corresponds to this choice), or, similarly, $m_k = \frac{1}{k}-\frac{1}{k+1}=\frac{1}{k^2+k}$.
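Concretely, with the second choice $m_k = 1/(k(k+1))$, the contribution of block $k$ to $H(Y)$ can be written in closed form: the block spreads mass $m_k$ uniformly over $2^k$ values, so it contributes $m_k\,(k - \log_2 m_k)$ bits. A small numerical sketch (the function name is mine) shows the partial sums of $H(Y)$ growing without bound:

```python
import math

def partial_entropy_bits(K):
    """Sum the entropy contributions of the first K blocks of Y, in bits.

    Block k carries total mass m_k = 1/(k(k+1)) spread uniformly over 2^k
    values, so it contributes m_k * (k - log2(m_k)) bits to H(Y)."""
    total = 0.0
    for k in range(1, K + 1):
        m_k = 1.0 / (k * (k + 1))
        total += m_k * (k - math.log2(m_k))
    return total

for K in (10, 100, 1000, 10_000):
    print(K, partial_entropy_bits(K))
# The partial sums keep growing (roughly like ln K), since
# E(M) = sum 1/(k+1) diverges, while the H(M) part stays finite.
```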
Yes, it can.
In this pdf, it is shown that the following random variable $X$ has infinite entropy:
Let $X$ take the values $2,3,\ldots$ with probability mass function $P[X=n]=p_n$, where $$p_n={1\over A n\log^2_2 n},\ \ \ n\ge2,$$ and $A = \sum\limits_{n=2}^\infty {1\over n\log^2_2 n}$ (the series converges by the Integral Test). Then $X$ has infinite entropy.
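As a rough numerical illustration (the truncation level `N` and the helper `partial_H` are my own choices, not from the linked pdf), one can approximate $A$ by a truncated sum and watch the partial entropy sums keep growing:

```python
import math

# Truncated normalizer: A ~ sum_{n=2}^{N} 1/(n * log2(n)^2).
# The neglected tail is O(1/log N), so this is only a rough estimate.
N = 10**5
A = sum(1.0 / (n * math.log2(n) ** 2) for n in range(2, N + 1))

def partial_H(M):
    """Partial sum of -p_n log2(p_n) for n = 2..M, using the truncated A."""
    total = 0.0
    for n in range(2, M + 1):
        p = 1.0 / (A * n * math.log2(n) ** 2)
        total += -p * math.log2(p)
    return total

for M in (10**2, 10**3, 10**4, 10**5):
    print(M, partial_H(M))
# The partial sums increase without bound, though only at the
# glacial rate of roughly log2(log2(M)).
```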
For completeness, I'll reproduce the argument contained in the link above that $H(X)=\infty$ here:
We have $$\eqalign{ H(X)&=-\sum_{n=2}^\infty p_n \log_2 p_n \cr &=\sum_{n=2}^\infty\bigl( -\log_2 p_n\bigr) p_n\cr &= \sum_{n=2}^\infty{\log_2( An \log_2^2 n )\over A n\log_2^2 n} \cr &=\sum_{n=2}^\infty {\log_2 A+\log_2 n +2\log_2(\log_2 n)\over A n \log_2^2 n}\cr &=\sum_{n=2}^\infty\biggl[\ \color{maroon}{\log_2 A\over A n\log_2^2 n}+\color{darkgreen}{1\over A n\log_2 n}+\color{darkblue}{2\log_2(\log_2 n) \over A n\log_2^2 n}\ \biggr].\cr } $$
Now
$$
\color{maroon}{
\sum\limits_{n=2}^\infty {\log_2 A\over A n\log_2^2 n}}
={\log_2 A\over A}
\sum\limits_{n=2}^\infty {1\over n\log_2^2 n} ={\log_2 A\over A}\cdot A=\log_2 A,
$$
which is finite, and $$ \color{darkblue}{ \sum\limits_{n=2}^\infty {2\log_2(\log_2 n)\over A n\log_2^2 n}} $$ is a sum of nonnegative terms.
It will follow that $H(X)=\infty$ if we can show that the sum $$ \color{darkgreen}{\sum_{n=2}^\infty {1\over An\log_2 n}} $$ diverges to $\infty$. But this follows easily from the Integral Test.
(From the Integral Test, it follows that $\int_2^\infty {dx\over x\log_2^p x}$
converges if and only if $p>1$.)
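The integral-test computation behind both claims can be written out with the substitution $u=\log_2 x$, so that $du = dx/(x\ln 2)$:
$$\int_2^\infty {dx\over x\log_2 x} = \ln 2\int_1^\infty {du\over u}=\infty, \qquad \int_2^\infty {dx\over x\log_2^2 x} = \ln 2\int_1^\infty {du\over u^2}=\ln 2<\infty.$$
The first integral shows that the green sum diverges; the second is the $p=2$ case that makes $A$ finite in the first place.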