Entropy of a uniform distribution
Solution 1:
Continuous (differential) entropy doesn't have quite the same meaning as discrete entropy. For example, we could also take $a = 0$ and $b = 1/2$, giving entropy $-\ln(2) < 0$, whereas in the discrete case entropy is always non-negative. Much of the difference comes from the fact that a probability density function (pdf) can exceed 1, as long as it does so on a set of measure (size) small enough that the density still integrates to 1.
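For concreteness, here is the computation behind that number, assuming (as in the question) that $X$ is uniform on $[a, b]$ with density $f(x) = \frac{1}{b-a}$: $$h(X) = -\int_a^b \frac{1}{b-a} \ln\left(\frac{1}{b-a}\right) dx = \ln(b-a),$$ which is negative exactly when $b - a < 1$; with $a = 0$ and $b = 1/2$ this gives $\ln(1/2) = -\ln(2)$.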
Check out the WolframAlpha entry on it: Differential Entropy. Also, here is the Wikipedia entry: Differential Entropy.
Compare this with the discrete case: suppose $P(X = x_n) = 1/N$, where $X$ takes the values $\{ x_1, \dots, x_N \}$. This gives entropy $$H(X) = -\sum_{n=1}^N P(X = x_n) \log_2 P(X = x_n) = -\sum_{n=1}^N {1 \over N} \log_2 {1 \over N} = N \cdot {1 \over N} \log_2 N = \log_2 N.$$ Note that this is actually the maximal value for the entropy of a distribution on $N$ points - this can be shown using Gibbs' inequality, or by using the concavity of the function $f(x) = -x \ln x$ (e.g. by differentiating twice, checking that $f''(x) = -1/x < 0$, and applying Jensen's inequality), and observing that $$\log_2 x = {\ln x \over \ln 2}.$$
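If you want a quick numerical sanity check, here is a minimal sketch in Python using SciPy's `entropy` (which computes $-\sum_n p_n \log p_n$); the choice $N = 8$ and the particular distribution `skewed` below are just arbitrary illustrative examples:

```python
import numpy as np
from scipy.stats import entropy  # computes -sum(p * log(p)); base=2 gives bits

N = 8

# Uniform distribution on N outcomes: entropy equals log2(N) = 3 bits.
uniform = np.full(N, 1.0 / N)
print(entropy(uniform, base=2), np.log2(N))  # 3.0  3.0

# Any other distribution on the same N outcomes has strictly smaller entropy,
# consistent with the Gibbs'-inequality / concavity argument above.
skewed = np.array([0.5, 0.2, 0.1, 0.05, 0.05, 0.04, 0.03, 0.03])
print(entropy(skewed, base=2))  # about 2.22 bits, strictly less than 3
```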
Hope this helps!