Independence and conditional independence between random variables
I like to interpret these two concepts as follows:
Events $A,B$ are independent if knowing that $A$ happened would not tell you anything about whether $B$ happened (or vice versa). For instance, suppose you were considering betting some money on event $B$. Some insider comes along and offers to pass you information (for a fee) about whether or not $A$ happened. To say that $A,B$ are independent is to say that this inside information would be utterly irrelevant, and you wouldn't pay any amount of money for it.
To say that events $A,B$ are conditionally independent given a third event $C$ means the following: suppose you already know that $C$ has happened. Then knowing whether $A$ happened would not convey any further information about whether $B$ happened: any relevant information that $A$ might convey is already contained in the knowledge that $C$ happened.
To see that independence does not imply conditional independence, here is one of my favorite simple counterexamples. Flip two fair coins. Let $A$ be the event that the first coin is heads, $B$ the event that the second coin is heads, and $C$ the event that the two coins land the same way (both heads or both tails). Clearly $A$ and $B$ are independent, but they are not conditionally independent given $C$: if you know that $C$ has happened, then knowing whether $A$ happened tells you everything about $B$ (on $C$ the two coins agree, so $B$ happens exactly when $A$ does). If you want an example with random variables rather than events, consider the indicators $1_A, 1_B, 1_C$.
Of interest here is that $A,B,C$ are pairwise independent but not mutually independent (since any two determine the third).
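If you like to check such things numerically, here is a minimal Monte Carlo sketch of this example in Python (the sample size and helper functions are my own choices, not part of the argument):

```python
import random

random.seed(0)
N = 100_000

# Flip two fair coins N times: A = first coin heads, B = second coin heads,
# C = the two coins agree.
samples = [(random.random() < 0.5, random.random() < 0.5) for _ in range(N)]
A = [a for a, b in samples]
B = [b for a, b in samples]
C = [a == b for a, b in samples]

def prob(event):
    return sum(event) / len(event)

def prob_given(event, cond):
    hits = [e for e, c in zip(event, cond) if c]
    return sum(hits) / len(hits)

# Unconditional independence: P(A and B) should be close to P(A) * P(B).
pA, pB = prob(A), prob(B)
pAB = prob([a and b for a, b in zip(A, B)])
print(f"P(A)P(B) = {pA * pB:.3f},  P(A and B) = {pAB:.3f}")   # both ~0.25

# Conditional on C the product rule fails: knowing A pins down B.
pA_C, pB_C = prob_given(A, C), prob_given(B, C)
pAB_C = prob_given([a and b for a, b in zip(A, B)], C)
print(f"P(A|C)P(B|C) = {pA_C * pB_C:.3f},  P(A and B|C) = {pAB_C:.3f}")  # ~0.25 vs ~0.5
```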
A nice counterexample in the other direction is the following. We have a bag containing two identical-looking coins. One of them (coin #1) is biased so that it comes up heads 99% of the time; the other (coin #2) comes up tails 99% of the time. We draw a coin from the bag at random and then flip it twice. Let $A$ be the event that the first flip is heads and $B$ the event that the second flip is heads. These are clearly not independent: you can do the calculation if you like, but the idea is that if the first flip was heads, this is strong evidence that you drew coin #1, and therefore the second flip is far more likely to be heads.
But let $C$ be the event that coin #1 was drawn (so $P(C)=1/2$). Now $A$ and $B$ are conditionally independent given $C$: if you know that $C$ happened, then you are just doing an experiment where you take a 99%-heads coin and flip it twice. Whether the first flip is heads or tails, the coin has no memory, so the probability of the second flip being heads is still 99%. So if you already know which coin you have, then knowing how the first flip came out is of no further help in predicting the second flip.
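The same kind of quick simulation works for the coin-in-a-bag example (again just a sketch, with an arbitrary sample size):

```python
import random

random.seed(1)
N = 200_000

def prob(xs):
    return sum(xs) / len(xs)

def prob_given(xs, cond):
    hits = [x for x, c in zip(xs, cond) if c]
    return sum(hits) / len(hits)

A, B, C = [], [], []
for _ in range(N):
    coin1 = random.random() < 0.5          # C: coin #1 was drawn
    p_heads = 0.99 if coin1 else 0.01      # coin #1 is 99% heads, coin #2 is 99% tails
    A.append(random.random() < p_heads)    # first flip heads
    B.append(random.random() < p_heads)    # second flip heads
    C.append(coin1)

AB = [a and b for a, b in zip(A, B)]

# Unconditionally, A and B are strongly dependent.
print(f"P(A)P(B) = {prob(A) * prob(B):.3f},  P(A and B) = {prob(AB):.3f}")
# ~0.25 vs ~0.49

# Given C (coin #1 drawn), the two flips are conditionally independent.
print(f"P(A|C)P(B|C) = {prob_given(A, C) * prob_given(B, C):.3f},  "
      f"P(A and B|C) = {prob_given(AB, C):.3f}")
# both ~0.98
```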
Independence does not imply conditional independence: for instance, independent random variables are rarely still independent once you condition on their sum or on their maximum.
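A concrete instance of this point, sketched with fair Bernoulli variables conditioned on their sum (the choice of distributions here is mine, purely for illustration):

```python
import random

random.seed(2)
N = 200_000

def prob_given(xs, cond):
    hits = [x for x, c in zip(xs, cond) if c]
    return sum(hits) / len(hits)

# X, Y independent fair Bernoulli(1/2); condition on the event X + Y = 1.
X = [random.random() < 0.5 for _ in range(N)]
Y = [random.random() < 0.5 for _ in range(N)]
S1 = [x + y == 1 for x, y in zip(X, Y)]

# Given X + Y = 1, X determines Y, so conditional independence fails badly.
pX = prob_given(X, S1)                                    # ~0.5
pY = prob_given(Y, S1)                                    # ~0.5
pXY = prob_given([x and y for x, y in zip(X, Y)], S1)     # exactly 0
print(f"P(X=1|S=1)P(Y=1|S=1) = {pX * pY:.3f},  P(X=1,Y=1|S=1) = {pXY:.3f}")
```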
Conditional independence does not imply independence: for instance, take $U$ uniform on $(0,1)$ and, given $U$, let $X$ and $Y$ be conditionally independent and uniform on $(0,U)$; then $X$ and $Y$ are not independent.
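And a sketch of this second point: conditionally on $U=u$ the two variables are independent uniforms on $(0,u)$, but unconditionally the events $\{X>1/2\}$ and $\{Y>1/2\}$ are far from independent (the threshold $1/2$ is an arbitrary choice):

```python
import random

random.seed(3)
N = 200_000

def prob(xs):
    return sum(xs) / len(xs)

# U uniform on (0,1); given U = u, draw X and Y independently uniform on (0,u).
events_X, events_Y = [], []
for _ in range(N):
    u = random.random()
    events_X.append(random.uniform(0, u) > 0.5)
    events_Y.append(random.uniform(0, u) > 0.5)

both = [x and y for x, y in zip(events_X, events_Y)]
print(f"P(X>1/2)P(Y>1/2) = {prob(events_X) * prob(events_Y):.3f},  "
      f"P(X>1/2, Y>1/2) = {prob(both):.3f}")
# ~0.024 vs ~0.057: X and Y are not (unconditionally) independent.
```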
If $(X,Y,Z)$ are (mutually) independent, then $X$ is independent of $Y$ conditional on $Z$. Proof (in terms of densities):
$$ f_{X,Y\mid Z} \;=\; \frac{f_{X,Y,Z}}{f_Z} \;=\; \frac{f_X\, f_Y\, f_Z}{f_Z} \;=\; f_X\, f_Y \;=\; f_{X\mid Z}\, f_{Y\mid Z}, $$
where the last equality uses $f_{X\mid Z}=f_X$ and $f_{Y\mid Z}=f_Y$, which hold because $(X,Z)$ and $(Y,Z)$ are each independent pairs.
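If you prefer a sanity check to symbol-pushing, the same identity can be verified exactly on a small discrete example (the particular marginals below are arbitrary; exact fractions avoid any floating-point fuzz):

```python
from fractions import Fraction
from itertools import product

# X, Y, Z mutually independent with some made-up marginals on {0, 1}.
pX = {0: Fraction(1, 4), 1: Fraction(3, 4)}
pY = {0: Fraction(2, 5), 1: Fraction(3, 5)}
pZ = {0: Fraction(1, 3), 1: Fraction(2, 3)}

# Joint pmf under mutual independence.
pXYZ = {(x, y, z): pX[x] * pY[y] * pZ[z]
        for x, y, z in product(pX, pY, pZ)}

for z in pZ:
    pz = sum(p for (x, y, zz), p in pXYZ.items() if zz == z)
    for x, y in product(pX, pY):
        joint_given_z = pXYZ[(x, y, z)] / pz
        px_given_z = sum(pXYZ[(x, yy, z)] for yy in pY) / pz
        py_given_z = sum(pXYZ[(xx, y, z)] for xx in pX) / pz
        # Conditional independence: P(X=x, Y=y | Z=z) = P(X=x|Z=z) P(Y=y|Z=z)
        assert joint_given_z == px_given_z * py_given_z
print("conditional independence holds exactly for every (x, y, z)")
```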