Solutions to the matrix equation $\mathbf{AB-BA=I}$ over general fields
Solution 1:
Let $k$ be a field. The first Weyl algebra $A_1(k)$ is the free associative $k$-algebra generated by two letters $x$ and $y$ subject to the relation $$xy-yx=1,$$ which is usually called the Heisenberg or Weyl commutation relation. This is an extremely important example of a non-commutative ring which appears in many places, from the algebraic theory of differential operators to quantum physics (the equation above is Heisenberg's indeterminacy principle, in a sense) to the pinnacles of Lie theory to combinatorics to pretty much anything else.
For us right now, this algebra shows up because
$A_1(k)$-modules are essentially the same thing as solutions to the equation $PQ-QP=I$ with $P$ and $Q$ endomorphisms of a vector space.
Indeed:
if $M$ is a left $A_1(k)$-module then $M$ is in particular a $k$-vector space and there is a homomorphism of $k$-algebras $\phi_M:A_1(k)\to\hom_k(M,M)$ to the endomorphism algebra of $M$ viewed as a vector space. Since $x$ and $y$ generate the algebra $A_1(k)$, $\phi_M$ is completely determined by the two endomorphisms $P=\phi_M(x)$ and $Q=\phi_M(y)$; moreover, since $\phi_M$ is an algebra homomorphism, we have $PQ-QP=\phi_M(xy-yx)=\phi_M(1_{A_1(k)})=\mathrm{id}_M$. We thus see that $P$ and $Q$ are endomorphisms of the vector space $M$ which satisfy our desired relation.
Conversely, if $M$ is a vector space and $P$, $Q:M\to M$ are two linear endomorphisms, then one can show more or less automatically that there is a unique algebra morphism $\phi_M:A_1(k)\to\hom_k(M,M)$ such that $\phi_M(x)=P$ and $\phi_M(y)=Q$. This homomorphism turns $M$ into a left $A_1(k)$-module.
These two constructions, one going from an $A_1(k)$-module to a pair $(P,Q)$ of endomorphisms of a vector space $M$ such that $PQ-QP=\mathrm{id}_M$, and the other going the other way, are mutually inverse.
A conclusion we get from this is that your question
for what fields $k$ do there exist $n\geq1$ and matrices $A$, $B\in M_n(k)$ such that $AB-BA=I$?
is essentially equivalent to
for what fields $k$ does $A_1(k)$ have finite dimensional modules?
Now, it is very easy to see that $A_1(k)$ is an infinite dimensional algebra, and that in fact the set $\{x^iy^j:i,j\geq0\}$ of monomials is a $k$-basis. Two of the key properties of $A_1(k)$ are the following:
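To make the normal form concrete, here is a small illustrative sketch (not part of the original argument; the helper `normal_form` is hypothetical) that rewrites an arbitrary word in the letters $x$ and $y$ into the basis $\{x^iy^j\}$, using the defining relation in the form $yx=xy-1$:

```python
# Rewrite a word in the letters 'x', 'y' into the normal-form basis
# {x^i y^j} of A_1(k), using the relation yx = xy - 1 (from xy - yx = 1).
# Elements are dicts mapping (i, j) -> integer coefficient of x^i y^j.
def normal_form(word):
    i = word.find('yx')
    if i == -1:  # already normally ordered: all x's precede all y's
        return {(word.count('x'), word.count('y')): 1}
    res = {}
    # replace the first occurrence of 'yx' by 'xy' (coefficient +1)
    # and by the empty word (coefficient -1), then recurse
    for w, c in ((word[:i] + 'xy' + word[i + 2:], 1),
                 (word[:i] + word[i + 2:], -1)):
        for m, cc in normal_form(w).items():
            res[m] = res.get(m, 0) + c * cc
    return {m: c for m, c in res.items() if c}

print(normal_form('xy'))   # {(1, 1): 1}
print(normal_form('yx'))   # {(1, 1): 1, (0, 0): -1}, i.e. xy - 1
```

For instance, `normal_form('yyx')` returns the dictionary for $xy^2-2y$, which is $y^2x$ rewritten in the monomial basis.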
Theorem. If $k$ is a field of characteristic zero, then $A_1(k)$ is a simple algebra—that is, $A_1(k)$ does not have any non-zero proper bilateral ideals. Its center is trivial: it is simply the $1$-dimensional subspace spanned by the unit element.
An immediate corollary of this is the following
Proposition. If $k$ is a field of characteristic zero, then $A_1(k)$ does not have any non-zero finite dimensional modules. Equivalently, there do not exist $n\geq1$ and a pair of matrices $P$, $Q\in M_n(k)$ such that $PQ-QP=I$.
Proof. Suppose $M$ is a finite dimensional $A_1(k)$-module. Then we have an algebra homomorphism $\phi:A_1(k)\to\hom_k(M,M)$ such that $\phi(a)(m)=am$ for all $a\in A_1(k)$ and all $m\in M$. Since $A_1(k)$ is infinite dimensional and $\hom_k(M,M)$ is finite dimensional (because $M$ is finite dimensional!) the kernel $I=\ker\phi$ cannot be zero —in fact, it must have finite codimension. Now $I$ is a bilateral ideal, so the theorem implies that it must be equal to $A_1(k)$. But then $M$ must be zero dimensional, for $1\in A_1(k)$ acts on it at the same time as the identity and as zero. $\Box$
This proposition can also be proved by taking traces, as everyone else has observed on this page, but the fact that $A_1(k)$ is simple is an immensely more powerful piece of knowledge (there are examples of algebras which do not have finite dimensional modules and which are not simple, by the way :) )
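For completeness, the trace argument alluded to here is the one-line computation $$\operatorname{tr}(PQ-QP)=\operatorname{tr}(PQ)-\operatorname{tr}(QP)=0,\qquad\text{while}\qquad\operatorname{tr}(I_n)=n,$$ and $n\neq0$ in a field of characteristic zero, so $PQ-QP=I$ is impossible for $n\geq1$.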
Now let us suppose that $k$ is of characteristic $p>0$. What changes in terms of the algebra? The most significant change is
Observation. The algebra $A_1(k)$ is not simple. Its center $Z$ is generated by the elements $x^p$ and $y^p$, which are algebraically independent, so that $Z$ is in fact isomorphic to a polynomial ring in two variables. We can write $Z=k[x^p,y^p]$.
In fact, once we notice that $x^p$ and $y^p$ are central elements —and this is proved by a straightforward computation— it is easy to write down non-trivial bilateral ideals. For example, $(x^p)$ works; the key point in showing this is the fact that since $x^p$ is central, the left ideal which it generates coincides with the bilateral ideal, and it is very easy to see that the left ideal is proper and non-zero.
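As a sanity check (an illustrative sketch, not part of the argument), one can verify the centrality of $x^p$ modulo $p$ in the model where $x$ acts as $d/dt$ and $y$ as multiplication by $t$ on polynomials, so that $(xy-yx)f=(tf)'-tf'=f$ holds on the nose; computing over the integers and reducing mod $p$:

```python
# Model of A_1: x acts as d/dt, y as multiplication by t, on polynomials
# represented as coefficient lists [c0, c1, ...] for c0 + c1*t + ...
p = 5

def D(f):                       # d/dt
    return [k * f[k] for k in range(1, len(f))]

def T(f):                       # multiplication by t
    return [0] + f

def Dn(f, n):                   # n-th derivative
    for _ in range(n):
        f = D(f)
    return f

def pad(f, n):
    return f + [0] * (n - len(f))

# The defining relation, exactly over the integers: (xy - yx)f = f.
f = [3, 1, 4, 1, 5]             # 3 + t + 4t^2 + t^3 + 5t^4
xy_f, yx_f = D(T(f)), T(D(f))
n = max(len(xy_f), len(yx_f))
print([u - v for u, v in zip(pad(xy_f, n), pad(yx_f, n))] == pad(f, n))

# Centrality of x^p mod p: (x^p y - y x^p)(t^k) is p times a polynomial.
results = []
for k in range(p, p + 4):
    tk = [0] * k + [1]
    lhs, rhs = Dn(T(tk), p), T(Dn(tk, p))
    m = max(len(lhs), len(rhs), 1)
    diff = [(u - v) % p for u, v in zip(pad(lhs, m), pad(rhs, m))]
    results.append(all(d == 0 for d in diff))
print(all(results))             # x^p commutes with y modulo p
```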
Moreover, a little playing with this will give us the following. Not only does $A_1(k)$ have bilateral ideals: it has bilateral ideals of finite codimension. For example, the ideal $(x^p,y^p)$ is easily seen to have codimension $p^2$; more generally, we can pick two scalars $a$, $b\in k$ and consider the ideal $I_{a,b}=(x^p-a,y^p-b)$, which has the same codimension $p^2$. Now this got rid of the obstruction to finding finite-dimensional modules that we had in the characteristic zero case, so we can hope for finite dimensional modules now!
More: this actually gives us a method to produce pairs of matrices satisfying the Heisenberg relation. We can just pick a proper bilateral ideal $I\subseteq A_1(k)$ of finite codimension, consider the finite dimensional $k$-algebra $B=A_1(k)/I$ and look for finitely generated $B$-modules: every such module provides us with a finite dimensional $A_1(k)$-module, and the observations above produce from it pairs of matrices which are related in the way we want.
So let us do this explicitly in the simplest case: suppose that $k$ is algebraically closed, let $a$, $b\in k$ and let $I=I_{a,b}=(x^p-a,y^p-b)$. The algebra $B=A_1(k)/I$ has dimension $p^2$, with $\{x^iy^j:0\leq i,j<p\}$ as a basis. The same proof that shows the Weyl algebra is simple when the ground field is of characteristic zero proves that $B$ is simple, and likewise the proof that the center of the Weyl algebra is trivial in characteristic zero shows that the center of $B$ is $k$; in passing from $A_1(k)$ to $B$ we have modded out the obstruction to carrying out these proofs. In other words, the algebra $B$ is what's called a (finite dimensional) central simple algebra. Wedderburn's theorem now implies that in fact $B\cong M_p(k)$, as this is the only semisimple algebra of dimension $p^2$ with trivial center. A consequence of this is that there is a unique (up to isomorphism) simple $B$-module $S$, of dimension $p$, and that all other finite dimensional $B$-modules are direct sums of copies of $S$.
Now, since $k$ is algebraically closed (much less would suffice) there is an $\alpha\in k$ such that $\alpha^p=-a$, so that $(-\alpha)^p=a$. Let $V=k^p$ and consider the $p\times p$-matrices $$Q=\begin{pmatrix}0&&&&b\\1&0\\&1&0\\&&1&0\\&&&\ddots&\ddots\end{pmatrix}$$ which is all zeroes except for $1$s in the first subdiagonal and a $b$ in the top right corner, and $$P=\begin{pmatrix}-\alpha&1\\&-\alpha&2\\&&-\alpha&3\\&&&\ddots&\ddots\\&&&&-\alpha&p-1\\&&&&&-\alpha\end{pmatrix}.$$ One can show that $P^p=aI$, $Q^p=bI$ and that $PQ-QP=I$, so they provide us with a morphism of algebras $B\to\hom_k(k^p,k^p)$, that is, they turn $k^p$ into a $B$-module. It must be isomorphic to $S$, because the two have the same dimension and there is only one module of that dimension up to isomorphism; this determines all finite dimensional modules, which are direct sums of copies of $S$, as we said above.
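A quick sanity check of these matrices over the prime field $\mathbb F_p$ (with $p=5$ and $a=2$, $b=3$ chosen arbitrarily). Over $\mathbb F_p$ the Frobenius fixes every scalar, so we may take $\alpha=-a$, which makes the diagonal of $P$ equal to $a$:

```python
# Verify PQ - QP = I, P^p = aI and Q^p = bI over F_p (p = 5, a = 2, b = 3).
p = 5
a, b = 2, 3
alpha = (-a) % p                         # over F_p, alpha = -a works

def mat_mul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(p)) % p
             for j in range(p)] for i in range(p)]

def mat_pow(X, n):
    R = [[int(i == j) for j in range(p)] for i in range(p)]
    for _ in range(n):
        R = mat_mul(R, X)
    return R

Id = [[int(i == j) for j in range(p)] for i in range(p)]

Q = [[0] * p for _ in range(p)]          # 1s on the subdiagonal, b top right
for i in range(p - 1):
    Q[i + 1][i] = 1
Q[0][p - 1] = b

P = [[0] * p for _ in range(p)]          # -alpha diagonal, 1..p-1 above it
for i in range(p):
    P[i][i] = (-alpha) % p
for i in range(p - 1):
    P[i][i + 1] = i + 1

PQ, QP = mat_mul(P, Q), mat_mul(Q, P)
comm = [[(PQ[i][j] - QP[i][j]) % p for j in range(p)] for i in range(p)]
print(comm == Id)                                             # PQ - QP = I
print(mat_pow(P, p) == [[a * e % p for e in r] for r in Id])  # P^p = aI
print(mat_pow(Q, p) == [[b * e % p for e in r] for r in Id])  # Q^p = bI
```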
This generalizes the example Henning gave, and in fact one can show that all $p$-dimensional $A_1(k)$-modules arise in this way from quotients by ideals of the form $I_{a,b}$. Taking direct sums for various choices of $a$ and $b$, this gives us lots of finite dimensional $A_1(k)$-modules and, therefore, lots of pairs of matrices satisfying the Heisenberg relation. I think we obtain in this way all the semisimple finite dimensional $A_1(k)$-modules, but I would need to think a bit before claiming it for certain.
Of course, this only deals with the simplest case. The algebra $A_1(k)$ has non-semisimple finite-dimensional quotients, which are rather complicated (and I think there are plenty of wild algebras among them...) so one can get many, many more examples of modules and of pairs of matrices.
Solution 2:
As noted in the comments, this is impossible in characteristic 0.
But $M_{2\times 2}(\mathbb F_2)$ contains the example $\pmatrix{0&1\\0&0}, \pmatrix{0&1\\1&0}$.
In general, in characteristic $p$, we can use the $p\times p$ matrices $$\pmatrix{0&1\\&0&2\\&&\ddots&\ddots\\&&&0&p-1\\&&&&0}, \pmatrix{0\\1&0\\&\ddots&\ddots\\&&1&0\\&&&1&0}$$ which work even over arbitrary unital rings of characteristic $p$.
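A quick computational check of these matrices over $\mathbb F_p$ (taking $p=7$; the choice is arbitrary):

```python
# Verify AB - BA = I for the general characteristic-p pair over F_p.
p = 7

A = [[0] * p for _ in range(p)]          # 1, 2, ..., p-1 on the superdiagonal
for i in range(p - 1):
    A[i][i + 1] = i + 1

B = [[0] * p for _ in range(p)]          # 1s on the subdiagonal
for i in range(p - 1):
    B[i + 1][i] = 1

def mul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(p)) % p
             for j in range(p)] for i in range(p)]

AB, BA = mul(A, B), mul(B, A)
comm = [[(AB[i][j] - BA[i][j]) % p for j in range(p)] for i in range(p)]
Id = [[int(i == j) for j in range(p)] for i in range(p)]
print(comm == Id)                        # AB - BA = I in characteristic p
```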
Solution 3:
In characteristic 0, we have an infinite-dimensional example over the vector space $\mathbb{F}[x]$: Let $A$ be multiplication by $x$, and $B$ be differentiation with respect to $x$.
This can be used to construct a finite-dimensional example over characteristic $p$. Let $\mathbb{F}$ be a field of characteristic $p$. Let $V$ be the $p$-dimensional space of polynomials over $\mathbb{F}$ of degree $<p$. Let $A$ be multiplication by $x$ on the elements $1, x, \ldots, x^{p-2}$ but send $x^{p-1}$ to 0. Let $B$ be differentiation with respect to $x$.
This example was constructed by noticing that multiplying $x^{p-1}$ by $x$ and then differentiating with respect to $x$ gives 0 anyway in characteristic $p$, so we can have $A$ send $x^{p-1}$ to 0 with no effect on the desired outcome. This allows us to truncate the infinite-dimensional example in characteristic 0 to a finite-dimensional example in characteristic $p$.
Edit: As Yemon Choi mentions in the comment below, this actually gives $AB-BA=-I$. To get $AB-BA=I$, either interchange $A$ and $B$ above, or (somewhat nonstandardly) interpret $A$ and $B$ as right operators instead of left operators.
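A quick check of the truncated operators over $\mathbb F_p$ (taking $p=5$, an arbitrary choice), confirming that this construction gives $AB-BA=-I$ as the edit notes:

```python
# Truncated multiplication-by-x (A) and d/dx (B) on polynomials of
# degree < p over F_p; columns are the images of the basis 1, x, ..., x^{p-1}.
p = 5

A = [[0] * p for _ in range(p)]          # x^j -> x^{j+1}, and x^{p-1} -> 0
for j in range(p - 1):
    A[j + 1][j] = 1

B = [[0] * p for _ in range(p)]          # differentiation: x^j -> j x^{j-1}
for j in range(1, p):
    B[j - 1][j] = j

def mul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(p)) % p
             for j in range(p)] for i in range(p)]

AB, BA = mul(A, B), mul(B, A)
comm = [[(AB[i][j] - BA[i][j]) % p for j in range(p)] for i in range(p)]
minus_Id = [[(-int(i == j)) % p for j in range(p)] for i in range(p)]
print(comm == minus_Id)                  # AB - BA = -I, so BA - AB = I
```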