What are the main uses of convex functions?
Convex functions are particularly easy to minimise (for example, any local minimum of a convex function is a global minimum). For this reason, there is a very rich theory for solving convex optimisation problems, with many practical applications (for example, circuit design, controller design, modelling, etc.), see here and here.
I can't remember whose quote this is, or exactly what it was, but it basically stated the following (EDIT: thanks to Rahul Narain for providing the correct quote, now included below):
"In fact the great watershed in optimization isn’t between linearity and nonlinearity, but convexity and nonconvexity." R.T. Rockafellar, "Lagrange Multipliers and Optimality", SIAM Review, 1993.
Aside: I first heard the above paraphrased in a lecture where the speaker (I can't remember who) also mentioned that researchers in the Soviet Union caught onto this much sooner than Western researchers (I can't personally vouch for whether that is true).
Besides optimization problems (which have already been mentioned in other answers), one common way to make use of convexity is via Jensen's inequality, which has the following nice and concise statement:
"If $f$ is a convex function applied to a bunch of values $x_i$ and their mean $\bar x$, then the mean of the $f$s is greater than or equal to $f$ of the mean. Also, if $f$ is strictly convex, the inequality is strict unless all the $x_i$'s are equal."
The reason this is so useful is that it turns out that, in probability theory, one quite often ends up comparing the averages of some set of values before and after applying a function, and the function quite often turns out, or can be arranged, to be (at least locally) convex.
For example, the standard deviation of a distribution is the square root of its variance (which in turn is the mean of the squared distances from the mean). Since the square root function is concave, it follows from Jensen's inequality that the sample standard deviation, i.e. the square root of the (unbiased) sample variance, is itself a biased estimator that on average underestimates the true standard deviation.
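A quick Monte Carlo sketch of this bias (the distribution, sample size, and trial count are arbitrary choices for illustration):

```python
import numpy as np

# The sample standard deviation (sqrt of the unbiased sample variance,
# ddof=1) underestimates the true standard deviation on average,
# because sqrt is concave (Jensen's inequality).
rng = np.random.default_rng(0)
true_sd = 2.0
n, trials = 5, 100_000

samples = rng.normal(0.0, true_sd, size=(trials, n))
sample_sds = samples.std(axis=1, ddof=1)  # sqrt of unbiased variance

# The average sample SD falls noticeably short of true_sd.
assert sample_sds.mean() < true_sd
```

Note that the sample *variance* (with `ddof=1`) is unbiased; it is taking the square root that introduces the downward bias.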
For another example, the arithmetic–geometric mean inequality follows directly from Jensen's inequality and the observation that the exponential function is convex (or, equivalently, that the logarithm is concave).
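A small numerical sketch of this derivation (the values are arbitrary): since $\log$ is concave, Jensen gives $\frac{1}{n}\sum_i \log x_i \le \log\left(\frac{1}{n}\sum_i x_i\right)$, and exponentiating both sides yields the AM–GM inequality.

```python
import numpy as np

# AM-GM via Jensen: the geometric mean is exp of the mean of the logs,
# and concavity of log forces it below the arithmetic mean.
x = np.array([1.0, 4.0, 9.0, 16.0])

arithmetic = x.mean()                 # 7.5
geometric = np.exp(np.log(x).mean())  # (1*4*9*16)**0.25 ~ 4.899

assert geometric <= arithmetic
```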
In fact, Jensen's inequality continues to hold even if the mean is replaced by any weighted mean (or, in other words, by any convex combination) of the values, provided that the values are weighted the same way before and after applying $f$.
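The weighted form can be checked the same way. A minimal sketch, with arbitrary weights forming a convex combination and $f = \exp$ as the convex function:

```python
import numpy as np

# Weighted Jensen: for weights w_i >= 0 summing to 1 and convex f,
# f(sum_i w_i * x_i) <= sum_i w_i * f(x_i),
# i.e. the same weights are used before and after applying f.
x = np.array([0.5, 2.0, 3.0, 10.0])
w = np.array([0.1, 0.2, 0.3, 0.4])  # nonnegative, sums to 1
f = np.exp                          # exp is convex

lhs = f(np.dot(w, x))   # f of the weighted mean
rhs = np.dot(w, f(x))   # weighted mean of the f values

assert lhs <= rhs
```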
This fact also has a nice geometric interpretation: if a set of points $(x_i,\,y_i)$ lie on the graph $y=f(x)$ of a convex function $f$, then their convex hull lies entirely above the graph and (assuming $f$ is strictly convex) intersects it only at the points $(x_i,\,y_i)$ themselves.
I at least find this connection between geometry and probability theory very elegant, not to mention often handy.
Convex functions have numerous applications. They can be used to prove various inequalities in an easy manner, and they arise throughout Operations Research, for instance in quadratic and geometric programming problems.