Comparing Differential Entropy of Normal Distribution to Shannon Entropy

The result for the differential entropy of the multivariate normal distribution simply arises from its definition:

$$h(X) = - \int f(x) \log f(x)\,dx$$
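For an $n$-dimensional Gaussian $X \sim \mathcal{N}(\mu, \Sigma)$ with covariance matrix $\Sigma$, carrying out this integral gives the familiar closed form (in nats):

$$h(X) = \frac{1}{2}\log\!\left((2\pi e)^n \det \Sigma\right)$$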

Are they related or comparable at all?

They are not directly comparable: the Shannon entropy of a continuous random variable is infinite.

Think of the case of a uniform random variable on $[0,a]$. Its differential entropy is $h(X)=\log(a)$. Now discretize the interval into $N = a/\Delta x$ bins of width $\Delta x$: the Shannon entropy of the resulting discrete uniform is $H(X) = \log(N) = \log(a/\Delta x) = \log(a) - \log(\Delta x)$, so $H(X) \to +\infty$ as $\Delta x \to 0$. (This agrees with the idea of Shannon entropy as average information: the information you can encode in a real number over any interval is infinite.)
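As a quick numerical sanity check (a sketch added here, not part of the original answer), the snippet below builds the discretized uniform for several bin widths and compares its Shannon entropy to $\log(a) - \log(\Delta x)$; the names `a`, `dx`, and `shannon_entropy` are just illustrative.

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy (in nats) of a discrete probability vector."""
    p = p[p > 0]
    return -np.sum(p * np.log(p))

a = 2.0                                   # X ~ Uniform[0, a], so h(X) = log(a)
print(f"differential entropy h(X) = {np.log(a):.4f}")

for dx in [0.1, 0.01, 0.001]:
    n_bins = round(a / dx)                # N = a / dx equal-width bins
    p = np.full(n_bins, 1.0 / n_bins)     # each bin carries probability dx / a = 1/N
    H = shannon_entropy(p)
    # H matches log(a) - log(dx) and grows without bound as dx -> 0,
    # while the differential entropy log(a) stays fixed.
    print(f"dx={dx:7.3f}  H={H:8.4f}  log(a)-log(dx)={np.log(a) - np.log(dx):8.4f}")
```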

Essentially the same relationship holds for the Gaussian, or for any well-behaved continuous density: for a fine discretization with bin width $\Delta x$, the Shannon entropy of the discretized variable behaves like $H \approx h(X) - \log(\Delta x)$, which diverges as $\Delta x \to 0$.

At most, you can think of the differential entropy as the "residual" of the Shannon entropy: what is left once the (divergent) discretization term $-\log(\Delta x)$ is set aside.
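To see this residual interpretation numerically, here is a minimal sketch (my addition, assuming a zero-mean normal with standard deviation `sigma` and using `scipy.stats.norm` for the bin probabilities): it discretizes $\mathcal{N}(0,\sigma^2)$ with bin width $\Delta x$ and shows that $H + \log(\Delta x)$ approaches the differential entropy $\tfrac{1}{2}\log(2\pi e\,\sigma^2)$, while $H$ itself keeps growing.

```python
import numpy as np
from scipy.stats import norm

sigma = 1.0
h_true = 0.5 * np.log(2 * np.pi * np.e * sigma**2)      # differential entropy of N(0, sigma^2)

for dx in [0.5, 0.1, 0.01]:
    # Discretize the real line into bins of width dx (truncated far out in the tails).
    edges = np.arange(-10 * sigma, 10 * sigma + dx, dx)
    p = np.diff(norm.cdf(edges, loc=0.0, scale=sigma))  # probability mass in each bin
    p = p[p > 0]
    H = -np.sum(p * np.log(p))                          # Shannon entropy of the discretization
    # H diverges as dx -> 0, but H + log(dx) converges to the differential entropy h(X).
    print(f"dx={dx:5.2f}  H={H:7.4f}  H+log(dx)={H + np.log(dx):7.4f}  h(X)={h_true:7.4f}")
```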