Data Processing Inequality for Random Vectors forming a Markov Chain

Solution 1:

The definitions of entropy, mutual information, etc., for discrete random variables are not specific to one-dimensional (scalar) quantities. They apply unchanged to multi-dimensional variables: the alphabet is simply a set of vectors rather than scalars. Hence it is true that $$ \begin{align} \mathbf{X} \rightarrow \mathbf{Y} \rightarrow \mathbf{Z} \implies I(\mathbf{X};\mathbf{Z}) \leq I(\mathbf{X};\mathbf{Y}) \end{align} $$ The standard proof carries over verbatim: by the chain rule for mutual information, $$ \begin{align} I(\mathbf{X};\mathbf{Y},\mathbf{Z}) = I(\mathbf{X};\mathbf{Z}) + I(\mathbf{X};\mathbf{Y}\mid\mathbf{Z}) = I(\mathbf{X};\mathbf{Y}) + I(\mathbf{X};\mathbf{Z}\mid\mathbf{Y}) \end{align} $$ and the Markov property gives $I(\mathbf{X};\mathbf{Z}\mid\mathbf{Y}) = 0$, so $I(\mathbf{X};\mathbf{Z}) \leq I(\mathbf{X};\mathbf{Y})$ since $I(\mathbf{X};\mathbf{Y}\mid\mathbf{Z}) \geq 0$.
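As a sanity check, the inequality can be verified numerically on a small vector-valued example. The sketch below is a hypothetical construction (not from the original answer): $\mathbf{X}$ is a uniform 2-bit vector, and each arrow of the chain $\mathbf{X} \rightarrow \mathbf{Y} \rightarrow \mathbf{Z}$ applies an independent bit-flip with probability `EPS` to each component (a component-wise binary symmetric channel). The names `channel` and `mutual_information` are illustrative helpers, not standard library functions.

```python
import itertools
import math

# Assumed example: 2-bit random vectors; each stage of the Markov chain
# X -> Y -> Z flips every bit independently with probability EPS.
EPS = 0.1
VECS = list(itertools.product([0, 1], repeat=2))


def channel(x, y, eps):
    """P(y | x) for a component-wise binary symmetric channel."""
    p = 1.0
    for xi, yi in zip(x, y):
        p *= (1 - eps) if xi == yi else eps
    return p


# Uniform prior on X, pushed through the two noisy stages.
p_x = {x: 1 / len(VECS) for x in VECS}
p_xy = {(x, y): p_x[x] * channel(x, y, EPS) for x in VECS for y in VECS}
p_xz = {
    (x, z): sum(p_xy[(x, y)] * channel(y, z, EPS) for y in VECS)
    for x in VECS
    for z in VECS
}


def mutual_information(joint):
    """I(A;B) in bits, computed from a joint distribution {(a, b): p}."""
    pa, pb = {}, {}
    for (a, b), p in joint.items():
        pa[a] = pa.get(a, 0.0) + p
        pb[b] = pb.get(b, 0.0) + p
    return sum(
        p * math.log2(p / (pa[a] * pb[b]))
        for (a, b), p in joint.items()
        if p > 0
    )


i_xy = mutual_information(p_xy)
i_xz = mutual_information(p_xz)
# Data processing inequality: processing Y into Z cannot add information about X.
assert i_xz <= i_xy
```

Because the second channel is strictly noisy, the inequality is strict here: $I(\mathbf{X};\mathbf{Z}) < I(\mathbf{X};\mathbf{Y})$.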