New posts in information-theory

Which are good books for applications of Shannon Information Theory?

Measure of how much information is lost in an implication

Confidence Interval of Information Entropy?

Probabilistic Proof of the Kraft–McMillan Inequality

Prove that bitstrings with a 1/0 ratio different from 50/50 are compressible

How much information can you transfer by choosing one number out of two?

Why is the entropy of a posterior Gaussian distribution higher than its prior?

What is the connection between Boltzmann's entropy expression and Shannon's entropy expression?

Lower bound on binomial coefficient

What is the minimum number of questions needed to determine a number a person is thinking of between 1 and 1000 when they are allowed to lie at most once?

How do I compute the approximate entropy of a bit string?

Data Processing Inequality for Random Vectors forming a Markov Chain

Error correction code handling deletions and insertions

Mutual information vs Information Gain

What is the relationship of $\mathcal{L}_1$ (total variation) distance to hypothesis testing?

Relation between Shannon Entropy and Total Variation distance

Searching radioactive balls

Understanding the relationship of the $L^1$ norm to the total variation distance of probability measures, and the variance bound on it

An entropy inequality

Is entropy of a binomial distribution an increasing function of $n$?