Newbetuts
New posts in information-theory
Which are good books for applications of Shannon Information Theory?
Tags: probability, information-theory, signal-processing, applications, neural-networks
Measure of how much information is lost in an implication
Tags: logic, soft-question, information-theory, philosophy, kolmogorov-complexity
Confidence Interval of Information Entropy?
Tags: probability, statistics, probability-theory, information-theory
Probabilistic Proof of the Kraft–McMillan Inequality
Tags: general-topology, information-theory, coding-theory, probabilistic-method
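For reference, the Kraft–McMillan inequality asked about above states that any uniquely decodable code over a $D$-ary alphabet with codeword lengths $\ell_1, \dots, \ell_n$ must satisfy

```latex
\sum_{i=1}^{n} D^{-\ell_i} \le 1
```

and conversely, any lengths satisfying this bound are achievable by a prefix code.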
Prove that bitstrings with a 1/0 ratio different from 50/50 are compressible
Tags: cryptography, information-theory, entropy, kolmogorov-complexity
How much information can you transfer by choosing one number out of two?
Tags: game-theory, information-theory, coding-theory
Why is the entropy of a posterior Gaussian distribution higher than that of its prior?
Tags: information-theory, entropy
What is the connection between Boltzmann's entropy expression and Shannon's entropy expression?
Tags: information-theory, statistical-mechanics
Lower bound on binomial coefficient
Tags: inequality, binomial-coefficients, information-theory
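A standard pair of elementary bounds relevant to the question above, valid for $1 \le k \le n$:

```latex
\left(\frac{n}{k}\right)^{k} \;\le\; \binom{n}{k} \;\le\; \left(\frac{en}{k}\right)^{k}
```

The lower bound follows by noting each factor $\frac{n-i}{k-i} \ge \frac{n}{k}$ in the product form of the binomial coefficient.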
What is the least number of questions needed to find the number a person is thinking of, between 1 and 1000, when they are allowed to lie at most once?
Tags: puzzle, information-theory
How do I compute the approximate entropy of a bit string?
Tags: encryption, entropy, information-theory, data-compression
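A common first estimate for the question above is the empirical Shannon entropy of the string's symbol frequencies. A minimal sketch (the function name `bit_entropy` is illustrative, not from the source):

```python
from collections import Counter
from math import log2

def bit_entropy(bits: str) -> float:
    """Empirical Shannon entropy, in bits per symbol, of a 0/1 string."""
    counts = Counter(bits)
    n = len(bits)
    # H = -sum over symbols of p * log2(p), using observed frequencies.
    return -sum((c / n) * log2(c / n) for c in counts.values())

print(bit_entropy("0101"))  # 1.0 (balanced string: one bit per symbol)
```

Note this per-symbol estimate ignores order; a string like "01010101" scores 1.0 despite being highly predictable, which is why block-based or compression-based estimators are often preferred.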
Data Processing Inequality for Random Vectors forming a Markov Chain
Tags: markov-chains, information-theory, entropy, mutual-information
Error correction code handling deletions and insertions
Tags: discrete-mathematics, finite-fields, information-theory, coding-theory
Mutual information vs Information Gain
Tags: information-theory, entropy
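The short answer to the comparison above is that the "information gain" used in decision trees is the mutual information $I(X;Y) = H(Y) - H(Y \mid X)$. A sketch computing it from a hypothetical joint distribution (the `joint` table below is made-up illustrative data):

```python
from math import log2

def entropy(probs) -> float:
    """Shannon entropy, in bits, of a probability vector."""
    return -sum(p * log2(p) for p in probs if p > 0)

# Hypothetical joint distribution p(x, y) over X in {a, b}, Y in {0, 1}.
joint = {
    ("a", 0): 0.25, ("a", 1): 0.25,
    ("b", 0): 0.40, ("b", 1): 0.10,
}

# Marginals p(x) and p(y).
px, py = {}, {}
for (x, y), p in joint.items():
    px[x] = px.get(x, 0.0) + p
    py[y] = py.get(y, 0.0) + p

# Information gain = H(Y) - H(Y|X), i.e. mutual information I(X;Y).
h_y = entropy(py.values())
h_y_given_x = sum(
    px[x] * entropy([joint[(x, y)] / px[x] for y in (0, 1)])
    for x in px
)
mi = h_y - h_y_given_x
```

Here knowing $X$ reduces the uncertainty about $Y$ by `mi` bits; the quantity is symmetric in $X$ and $Y$ even though the "gain" formulation looks one-sided.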
What is the relationship of $\mathcal{L}_1$ (total variation) distance to hypothesis testing?
Tags: statistics, probability-theory, information-theory, decision-theory
Relation between Shannon Entropy and Total Variation distance
Tags: probability-theory, measure-theory, information-theory, entropy, total-variation
Searching radioactive balls
Tags: combinatorics, optimization, algorithms, information-theory, extremal-combinatorics
Understanding the relationship of the $L^1$ norm to the total variation distance of probability measures, and the variance bound on it
Tags: probability-theory, inequality, information-theory
An entropy inequality
Tags: real-analysis, convex-analysis, information-theory
Is entropy of a binomial distribution an increasing function of $n$?
Tags: probability, probability-distributions, random-variables, information-theory, entropy
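For the symmetric case $p = 1/2$ the answer to the question above can be checked numerically: the entropy of $\mathrm{Binomial}(n, 1/2)$ grows with $n$ (asymptotically like $\tfrac{1}{2}\log_2(\pi e n / 2)$). A sketch, assuming exact summation over the support:

```python
from math import comb, log2

def binom_entropy(n: int, p: float) -> float:
    """Shannon entropy, in bits, of a Binomial(n, p) distribution."""
    h = 0.0
    for k in range(n + 1):
        pk = comb(n, k) * p**k * (1 - p)**(n - k)
        if pk > 0:
            h -= pk * log2(pk)
    return h

# Entropy is strictly increasing in n for p = 1/2 over this range.
vals = [binom_entropy(n, 0.5) for n in range(1, 11)]
print(all(a < b for a, b in zip(vals, vals[1:])))  # True
```

This direct summation is exact but $O(n)$ per evaluation; for large $n$ the Gaussian approximation to the entropy is typically used instead.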