What are the differences and the relationship between Shannon entropy and Fisher information?
Fisher information is related to the asymptotic variability of a maximum likelihood estimator: higher Fisher information means lower asymptotic estimation error. Formally, for a model $p_\theta(x)$ it is the variance of the score, $I(\theta) = \mathrm{E}\!\left[\left(\frac{\partial}{\partial\theta} \log p_\theta(X)\right)^2\right]$, and the MLE from $n$ i.i.d. samples has asymptotic variance $1/(n\,I(\theta))$; the Cramér–Rao bound gives the corresponding finite-sample lower bound for unbiased estimators.
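Here is a minimal simulation sketch of that claim, using a Bernoulli model (where $I(p) = 1/\bigl(p(1-p)\bigr)$ in closed form); the parameter values are arbitrary choices for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

p, n, trials = 0.3, 1000, 5000  # true parameter, sample size, replications

# Fisher information of a single Bernoulli(p) observation: I(p) = 1 / (p (1 - p))
fisher_info = 1.0 / (p * (1.0 - p))

# The MLE of p is the sample mean; estimate its variance over many replications
mles = rng.binomial(n, p, size=trials) / n
print("empirical MLE variance:", mles.var())
print("asymptotic 1/(n I(p)): ", 1.0 / (n * fisher_info))
```

The two printed numbers should agree closely, since for the Bernoulli the asymptotic variance $p(1-p)/n$ happens to be exact at every $n$.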
Shannon entropy is a different quantity: it measures the average information content (uncertainty) of a distribution, not the precision with which a parameter can be estimated. Higher-entropy distributions are less predictable, so by the source coding theorem their outcomes require *more* bits on average to transmit, not fewer. (In the Bayesian sense, a high-entropy prior is called "less informative" because it commits to less about where the parameter lies.)
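A quick numerical illustration of that point, with hand-picked example distributions:

```python
import numpy as np

def shannon_entropy(probs):
    """Entropy in bits of a discrete distribution (0 log 0 treated as 0)."""
    probs = np.asarray(probs, dtype=float)
    nz = probs[probs > 0]
    return -np.sum(nz * np.log2(nz))

# A peaked distribution is predictable: few bits per symbol on average.
print(shannon_entropy([0.97, 0.01, 0.01, 0.01]))  # ~0.24 bits
# The uniform distribution over 4 outcomes maximizes entropy: 2 bits.
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0 bits
```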
However, there is a close relationship between Fisher information and relative entropy (KL divergence), as discussed on Wikipedia: Fisher information is the curvature of the KL divergence between nearby members of the same parametric family. For small $\varepsilon$, $D_{\mathrm{KL}}\bigl(p_\theta \,\|\, p_{\theta+\varepsilon}\bigr) \approx \tfrac{1}{2}\,\varepsilon^2 I(\theta)$ (or $\tfrac{1}{2}\,\varepsilon^\top I(\theta)\,\varepsilon$ in the multivariate case), so Fisher information measures how distinguishable nearby parameter values are, in units of relative entropy.
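You can check that quadratic expansion numerically; a sketch for the Bernoulli family, where the KL divergence has a simple closed form (the values of $p$ and $\varepsilon$ are arbitrary):

```python
import numpy as np

def kl_bernoulli(p, q):
    """KL divergence D(Bern(p) || Bern(q)) in nats."""
    return p * np.log(p / q) + (1 - p) * np.log((1 - p) / (1 - q))

p, eps = 0.3, 1e-3
fisher_info = 1.0 / (p * (1.0 - p))  # Fisher information of Bernoulli(p)

# Second-order expansion: D(p || p + eps) ~ 0.5 * eps^2 * I(p)
print("exact KL:         ", kl_bernoulli(p, p + eps))
print("quadratic approx.:", 0.5 * eps**2 * fisher_info)
```

The first-order term vanishes because $D_{\mathrm{KL}}$ is minimized (at zero) when the two distributions coincide, which is exactly why the Fisher information appears as the leading curvature term.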