Ideas for finding counterexamples?

The question is from an exercise in Gilbert Strang's *Linear Algebra*.

Construct $2$ by $2$ matrices $A$ and $B$ such that the eigenvalues of $AB$ are not the products of the eigenvalues of $A$ and $B$, and the eigenvalues of $A+B$ are not the sums of the individual eigenvalues.

It's obvious that $A$ and $B$ cannot both be diagonal matrices.

Here is my question:

How should I approach the construction? Any heuristics?


Added:

This may seem to be a rather "stupid" question: trial and error with MATLAB can lead to the desired matrices. However, I'd like to go a little further. "Trial and error" can be viewed as one method of finding counterexamples. My second question may be even more "stupid", and it is admittedly vague:

Can anyone come up with a "general idea" for the construction of counterexamples in mathematics?


Finding counterexamples is something of a dark art; I have seen literally no mathematical writing of any kind which explicitly discusses how one might go about doing it, even though it is quite an important mathematical skill. Here are some thoughts off the top of my head.

Don't underestimate trial and error. It's generally easy to do, and it is often very enlightening. You don't have an excuse not to do it, especially if you have a CAS. (Conceptual explanations are overrated in situations like this. Sometimes a statement is false just because it has no reason to be true, and if your intuition tells you otherwise, update your intuition.)
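For this particular exercise, trial and error is easy to automate. Here is a sketch in Python with NumPy (the helper names, the integer search range, and the random seed are all my own choices, not part of the exercise); it draws random small-integer matrices and keeps the first pair for which no pairing of eigenvalues gives the eigenvalues of $AB$ as products or of $A+B$ as sums:

```python
import itertools
import numpy as np

def multiset_close(xs, ys, tol=1e-9):
    """Compare two multisets of complex numbers up to tolerance."""
    return np.allclose(np.sort_complex(np.asarray(xs)),
                       np.sort_complex(np.asarray(ys)), atol=tol)

def is_counterexample(A, B):
    """True if NO pairing of the eigenvalues of A and B yields the
    eigenvalues of A @ B as products and of A + B as sums."""
    lamA, lamB = np.linalg.eigvals(A), np.linalg.eigvals(B)
    pairings = [list(zip(lamA, p)) for p in itertools.permutations(lamB)]
    prod_ok = any(multiset_close(np.linalg.eigvals(A @ B),
                                 [x * y for x, y in pr]) for pr in pairings)
    sum_ok = any(multiset_close(np.linalg.eigvals(A + B),
                                [x + y for x, y in pr]) for pr in pairings)
    return not prod_ok and not sum_ok

rng = np.random.default_rng(0)
for _ in range(1000):  # generically, a random pair works almost immediately
    A = rng.integers(-2, 3, (2, 2)).astype(float)
    B = rng.integers(-2, 3, (2, 2)).astype(float)
    if is_counterexample(A, B):
        print("A =", A.tolist(), " B =", B.tolist())
        break
```

The loop terminating quickly is itself instructive: "most" pairs of matrices are counterexamples, which supports the point that the statement is false simply because it has no reason to be true.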

Sometimes blindly constructing a counterexample works. In certain situations, you can look at the list of properties a counterexample needs to satisfy and just write down an example which satisfies the first property, modify it to satisfy the second, etc. and there will be essentially no obstruction to doing this. This is also generally easy to do, when it works.

Become familiar with small examples, and know ways of constructing large ones. (This is extraordinarily important, and it seems to me that relatively few students ever bother to do it.) Sometimes trial and error is difficult because it's unclear how to construct a random example to check. For matrices this is fairly straightforward, but for other types of objects (such as groups) it may be less so.

So if you want to find a counterexample in group theory, for example, you should become familiar with the classification of groups of small order, and you should also know ways of constructing larger, complicated groups (such as $S_n, \text{GL}_n(\mathbb{F}_p)$, semidirect products, etc.).

Make sure your examples somehow exhaust the complexity of the objects in question: looking at abelian groups to construct a counterexample to a statement about groups could work, but if it doesn't you should quickly start looking further.

Try to prove that no counterexamples exist. This can often be helpful. If there's some assumption you seem to need to make a simple proof work, look at examples that violate that assumption. If you don't even know where to start, go back to trial and error until you have a better feeling for how the objects in question behave.

(Advanced:) Become familiar with standard counterexamples. Some fields of mathematics, such as general topology, have well-known counterexamples that would be difficult to come up with on your own, so most of the time you're better off learning what they are than trying to come up with your own. Standard counterexamples, or the ideas behind them, can often be modified or combined to give counterexamples to other statements.


Less philosophically, I think Strang's exercise is not quite "well-posed", in that it seems to assume some sort of "canonical" ordering of the eigenvalues of a matrix.

I mean, let's say that $\lambda_X, \mu_X$ are the eigenvalues of $X$ for $X$ a $2\times 2$ matrix. Then, what's the meaning of the question for the product? For instance:

$$ \lambda_{AB} \neq \lambda_A \lambda_B \ ,\qquad \text{or} \qquad \lambda_{AB} \neq \lambda_A \mu_B \ , \qquad\text{or both?} $$

Anyway, assuming the question makes sense, the next step could be the following: $A$ and $B$ cannot both be diagonal matrices, as you say. But they can each be diagonalizable, as long as they are diagonalized in different bases. Otherwise, the eigenvalues of $AB$ and $A + B$ would indeed still be "the" products and "the" sums of those of $A$ and $B$.
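To see why simultaneous diagonalizability rules out a counterexample, one can check numerically. A quick NumPy sketch (the shared basis $P$ and the eigenvalues $2, 3, 5, 7$ are arbitrary choices for illustration):

```python
import numpy as np

# A and B diagonal in the SAME basis P:
#   AB    = P diag(2*5, 3*7) P^-1   -> eigenvalues are the products
#   A + B = P diag(2+5, 3+7) P^-1   -> eigenvalues are the sums
P = np.array([[1.0, 1.0],
              [1.0, -1.0]])                 # arbitrary shared eigenbasis
A = P @ np.diag([2.0, 3.0]) @ np.linalg.inv(P)
B = P @ np.diag([5.0, 7.0]) @ np.linalg.inv(P)

print(sorted(np.linalg.eigvals(A @ B).real))   # ~ [10, 21]
print(sorted(np.linalg.eigvals(A + B).real))   # ~ [7, 10]
```

So a counterexample must mix eigenbases, which is exactly what the construction below does.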

So, one could start with an easy matrix such as

$$ A = \begin{pmatrix} 1 & 0 \\ 0 & -1 \end{pmatrix} $$

with obvious eigenvalues $1,-1$, and perform a change of basis on it, for instance to the basis $\left\{ (1,1), (1,-1) \right\}$. You would obtain

$$ B= \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix} $$

with the same eigenvalues as $A$. Now compute

$$ AB = \begin{pmatrix} 0 & 1 \\ -1 & 0 \end{pmatrix} \qquad \text{and} \qquad A + B = \begin{pmatrix} 1 & 1 \\ 1 & -1 \end{pmatrix} \ . $$

The eigenvalues of $AB$ are $i,-i$, and those of $A+B$ are $\sqrt{2}, -\sqrt{2}$: certainly not "the" products or "the" sums, respectively, of those of $A$ and $B$.
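These claims are easy to verify numerically; a short NumPy check, with $P$ holding the basis vectors $(1,1)$ and $(1,-1)$ as columns:

```python
import numpy as np

A = np.diag([1.0, -1.0])
P = np.array([[1.0, 1.0],
              [1.0, -1.0]])        # columns: the basis vectors (1,1) and (1,-1)
B = np.linalg.inv(P) @ A @ P       # change of basis applied to A
print(B)                           # ~ [[0, 1], [1, 0]]

print(np.linalg.eigvals(A @ B))    # ~ i, -i
print(np.linalg.eigvals(A + B))    # ~ sqrt(2), -sqrt(2)
```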