What is the mutual information of three variables?

The mutual information of three variables is $$I(X;Y;Z)=\sum_x\sum_y\sum_z p(x,y,z)\ln\frac{p(x,y)\,p(x,z)\,p(y,z)}{p(x,y,z)\,p(x)\,p(y)\,p(z)}$$ With the natural logarithm this is measured in nats, of course. This is in accord with the definition $I(X;Y;Z)=I(X;Y)-I(X;Y|Z)$. For more than three variables the expression is similar: all joint/marginal probabilities appear in the fraction, with those of an even number of variables in the numerator and those of an odd number of variables in the denominator (above, the pairwise joints are in the numerator, while the full joint and the single-variable marginals are in the denominator). Multivariate mutual information is symmetric in all its arguments, but unlike the mutual information of two random variables it can be positive, negative, or zero, and there is no generally accepted interpretation of what its sign and magnitude tell you.
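
To make the sign behavior concrete, here is a minimal Python sketch that evaluates the sum above directly (the helper name `interaction_information` is my own and the joint pmf is held in a numpy array; neither comes from the source). It uses base-2 logarithms, so the result is in bits rather than nats. The classic XOR construction gives $-1$ bit, while three copies of a single fair bit give $+1$ bit:

```python
import numpy as np
from itertools import product

def interaction_information(p_xyz):
    """I(X;Y;Z) in bits, for a joint pmf stored as a 3-D array p_xyz[x, y, z]."""
    p_xy = p_xyz.sum(axis=2)          # pairwise joints (numerator terms)
    p_xz = p_xyz.sum(axis=1)
    p_yz = p_xyz.sum(axis=0)
    p_x = p_xyz.sum(axis=(1, 2))      # single marginals (denominator terms)
    p_y = p_xyz.sum(axis=(0, 2))
    p_z = p_xyz.sum(axis=(0, 1))
    total = 0.0
    for x, y, z in product(*map(range, p_xyz.shape)):
        p = p_xyz[x, y, z]
        if p > 0:                     # zero-probability cells contribute nothing
            total += p * np.log2(p_xy[x, y] * p_xz[x, z] * p_yz[y, z]
                                 / (p * p_x[x] * p_y[y] * p_z[z]))
    return total

# Z = X XOR Y with X, Y independent fair bits: I(X;Y;Z) = -1 bit.
p_xor = np.zeros((2, 2, 2))
for x, y in product(range(2), repeat=2):
    p_xor[x, y, x ^ y] = 0.25
print(interaction_information(p_xor))   # -1.0

# X = Y = Z, one fair bit copied three times: I(X;Y;Z) = +1 bit.
p_copy = np.zeros((2, 2, 2))
p_copy[0, 0, 0] = p_copy[1, 1, 1] = 0.5
print(interaction_information(p_copy))  # 1.0
```

Replacing `np.log2` with `np.log` would give the value in nats, matching the formula above.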


The general concept is called multivariate mutual information (in this sign convention it is also known as the co-information), but I believe that hardly anybody knows what it actually means or how it can be used. Note that the multivariate mutual information can become negative. For three variables it is defined as

$$I(X;Y;Z)=I(X;Y)-I(X;Y|Z)$$

where $I(X;Y|Z)$ is the conditional mutual information of $X$ and $Y$ given $Z$.
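
A standard example shows how this definition goes negative: let $X$ and $Y$ be independent fair bits and $Z = X \oplus Y$. Independence gives $I(X;Y)=0$, but once $Z$ is known, $Y$ is determined by $X$, so $I(X;Y|Z)=1$ bit, and hence

$$I(X;Y;Z) = I(X;Y) - I(X;Y|Z) = 0 - 1 = -1\ \text{bit}.$$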