Eigenvalues of linear operator $F(A) = AB + BA$
Solution 1:
In this solution, we only assume that $B$ is diagonalizable (i.e., the eigenvalues $\lambda_i$ need not be distinct). Let $v_1,v_2,\ldots,v_n$ be eigenvectors of $B$ and $w_1,w_2,\ldots,w_n$ the corresponding left eigenvectors, so that $Bv_i=\lambda_iv_i$ and $w_i^\top B=\lambda_i w_i^\top$ for $i=1,2,\ldots,n$. Then, $$ \begin{align} F\left(v_i w_j^\top\right)&= \left(v_iw_j^\top\right)B+B\left(v_i w_j^\top\right)=v_i\left(w_j^\top B\right)+\left(Bv_i\right)w_j^\top \\ &=v_i\left(\lambda_j w_j^\top\right)+\left(\lambda_iv_i\right)w_j^\top=\left(\lambda_i+\lambda_j\right)v_iw_j^\top\,. \end{align}$$ Since the $n^2$ matrices $v_iw_j^\top$, where $i,j=1,2,\ldots,n$, are linearly independent, we have found all eigenvectors of $F$; the eigenvalues of $F$ are therefore precisely the sums $\lambda_i+\lambda_j$.
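As a quick numerical sanity check (my addition, not part of the proof), one can represent $F$ on vectorized matrices via the Kronecker-product identity $\operatorname{vec}(AB+BA)=\left(B^\top\otimes I+I\otimes B\right)\operatorname{vec}(A)$ and compare the spectrum of that $n^2\times n^2$ matrix with the pairwise sums $\lambda_i+\lambda_j$:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
B = rng.standard_normal((n, n))  # a random matrix is diagonalizable almost surely

# F acts on vec(A) (column-major) as M = B^T ⊗ I + I ⊗ B
I = np.eye(n)
M = np.kron(B.T, I) + np.kron(I, B)

lam = np.linalg.eigvals(B)
sums = (lam[:, None] + lam[None, :]).ravel()  # all lambda_i + lambda_j
spec = np.linalg.eigvals(M)

# every pairwise sum is an eigenvalue of M, and vice versa
dist = np.abs(sums[:, None] - spec[None, :])
assert dist.min(axis=1).max() < 1e-8
assert dist.min(axis=0).max() < 1e-8
```

Matching by distance rather than sorting avoids ordering ambiguities when complex eigenvalues have nearly equal real parts.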
EDIT (Due to Request):
We shall prove that the matrices $v_iw_j^\top$, for $i,j=1,2,\ldots,n$, are linearly independent. Let $K$ be the base field. Suppose that there exist $\kappa_{i,j}\in K$ for $i,j=1,2,\ldots,n$ such that $\sum_{i=1}^n\sum_{j=1}^n\kappa_{i,j}v_iw_j^\top=\boldsymbol{0}_{n\times n}$. Write $w_j=\left(w_j^1,w_j^2,\ldots,w_j^n\right)$ for $j=1,2,\ldots,n$. Hence, $v_iw_j^\top=\begin{bmatrix} w_j^1v_i&w_j^2v_i&\cdots&w_j^nv_i\end{bmatrix}$. Therefore, $\sum_{i=1}^n\sum_{j=1}^n\kappa_{i,j}v_iw_j^\top=\boldsymbol{0}_{n\times n}$ implies that $$\begin{bmatrix}\displaystyle\sum_{i=1}^n\left(\sum_{j=1}^n\kappa_{i,j}w_j^1\right)v_i & \displaystyle\sum_{i=1}^n\left(\sum_{j=1}^n\kappa_{i,j}w_j^2\right)v_i & \cdots & \displaystyle\sum_{i=1}^n\left(\sum_{j=1}^n\kappa_{i,j}w_j^n\right)v_i \end{bmatrix}=\boldsymbol{0}_{n\times n}\,.$$ Consequently, for $i=1,2,\ldots,n$ and $k=1,2,\ldots,n$, we must have $\sum_{j=1}^n\kappa_{i,j}w_j^k=0$, since the $v_i$'s are linearly independent. That is, $\sum_{j=1}^n\kappa_{i,j}w_j=\boldsymbol{0}_{n\times 1}$ for $i=1,2,\ldots,n$. As the vectors $w_j$ are linearly independent, $\kappa_{i,j}=0$ for all $i,j=1,2,\ldots,n$, and the result follows immediately.
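The independence can also be sanity-checked numerically (my construction; the names `V`, `W`, `S` are mine): stack the vectorized matrices $v_iw_j^\top$ as columns of an $n^2\times n^2$ matrix and verify full rank:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 3
B = rng.standard_normal((n, n))
lam, V = np.linalg.eig(B)       # columns of V: right eigenvectors v_i
W = np.linalg.inv(V).T          # columns of W: left eigenvectors w_j

# stack vec(v_i w_j^T) as columns of an n^2 x n^2 matrix
cols = [np.outer(V[:, i], W[:, j]).ravel() for j in range(n) for i in range(n)]
S = np.column_stack(cols)
assert np.linalg.matrix_rank(S) == n * n  # the n^2 matrices are independent
```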
P.S.:
(1) I think my solution is identical to himbrom's.
(2) This solution works similarly if $F$ is defined via $F(A)=\alpha AB+\beta BA$ for every matrix $A\in\text{Mat}_{n\times n}(K)$, where $\alpha,\beta \in K$ are nonzero. For $i,j=1,2,\ldots,n$, the matrix $v_iw_j^\top$ is still an eigenvector of $F$, but with the eigenvalue $\alpha \lambda_j+\beta \lambda_i$.
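Point (2) can be checked directly (a sketch; obtaining the left eigenvectors as the rows of $V^{-1}$ from `np.linalg.eig` is my addition): each $v_iw_j^\top$ should satisfy $F\left(v_iw_j^\top\right)=\left(\alpha\lambda_j+\beta\lambda_i\right)v_iw_j^\top$.

```python
import numpy as np

rng = np.random.default_rng(1)
n, alpha, beta = 3, 2.0, -5.0
B = rng.standard_normal((n, n))

lam, V = np.linalg.eig(B)   # B = V diag(lam) V^{-1}
Vinv = np.linalg.inv(V)     # row i of Vinv is the left eigenvector w_i^T

for i in range(n):
    for j in range(n):
        A = np.outer(V[:, i], Vinv[j, :])      # v_i w_j^T
        FA = alpha * A @ B + beta * B @ A      # F(A) = alpha*AB + beta*BA
        # eigenvalue is alpha*lam_j + beta*lam_i
        assert np.allclose(FA, (alpha * lam[j] + beta * lam[i]) * A, atol=1e-8)
```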
(3) It would be an interesting problem to see if the converse holds. Suppose, for fixed $\alpha,\beta \in K \setminus\{0\}$ and for a fixed matrix $B\in\text{Mat}_{n\times n}(K)$, that $F(A)=\alpha AB+\beta BA$ for every matrix $A\in\text{Mat}_{n\times n}(K)$. If $F$ is a diagonalizable linear operator, then does it follow that $B$ is a diagonalizable matrix? Does the answer depend on $K$, $\alpha$, and/or $\beta$? For example, in the case where $K$ is algebraically closed of characteristic $0$, $\alpha=1$, and $\beta=-1$, diagonalizability of $F$ is equivalent to that of $B$ (this is a well-known result in Lie algebra theory). If $\alpha=0$ or $\beta=0$ (but not both), then $F$ is diagonalizable if and only if $B$ is.
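A small illustration of the $\alpha=1$, $\beta=-1$ direction (my example): for a non-diagonalizable $B$, say a $2\times 2$ Jordan block, the operator $F(A)=AB-BA$ is nonzero and nilpotent, hence not diagonalizable either, since a diagonalizable operator with all eigenvalues zero is the zero operator.

```python
import numpy as np

# B = 2x2 nilpotent Jordan block (not diagonalizable); with alpha=1, beta=-1,
# F(A) = AB - BA acts on vec(A) as M = B^T ⊗ I - I ⊗ B.
B = np.array([[0.0, 1.0], [0.0, 0.0]])
I = np.eye(2)
M = np.kron(B.T, I) - np.kron(I, B)

# All eigenvalues of M are lam_j - lam_i = 0, yet M^2 != 0 while M^3 = 0:
# M is nilpotent of index 3, so M is nonzero nilpotent, hence not diagonalizable.
assert np.any(np.linalg.matrix_power(M, 2) != 0)
assert np.all(np.linalg.matrix_power(M, 3) == 0)
```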
Solution 2:
Let $e_i$ be the eigenvectors of $B$, i.e. $$Be_i=\lambda_ie_i$$ and let $E_{ij}=e_i e_j^T$ be the elementary matrices in this basis, i.e. $$E_{ij}e_k=\delta_{jk}e_i$$ (here $e_j^T$ denotes the dual basis row vector, so that $e_j^Te_k=\delta_{jk}$; if the eigenvectors are not orthonormal, this is the corresponding left eigenvector). As it turns out, $F$ is already diagonal in the $E_{ij}$-basis: \begin{align} &BE_{ij}e_k=\delta_{jk}Be_i=\lambda_i\delta_{jk}e_i=\lambda_iE_{ij}e_k\\ \Rightarrow{}&F(E_{ij})e_k=BE_{ij}e_k+E_{ij}Be_k=\lambda_iE_{ij}e_k+\lambda_kE_{ij}e_k=(\lambda_i+\lambda_j)E_{ij}e_k, \end{align} where the last step uses $\lambda_kE_{ij}e_k=\lambda_k\delta_{jk}e_i=\lambda_j\delta_{jk}e_i$. The eigenvalues are therefore $\lambda_i+\lambda_j$. (Some of them may coincide; for example, swapping $i\leftrightarrow{}j$ yields the same eigenvalue.)
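That $F$ is diagonal in the $E_{ij}$-basis can be verified numerically (a sketch; the change of basis uses that, with column-major vectorization, $\operatorname{vec}\left(e_iw_j^\top\right)=w_j\otimes e_i$ with $w_j$ the left eigenvectors, which is my addition):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 3
B = rng.standard_normal((n, n))

lam, V = np.linalg.eig(B)   # columns of V: eigenvectors e_i
W = np.linalg.inv(V).T      # columns of W: left eigenvectors (dual basis)

# F acts on vec(A) as M = B^T ⊗ I + I ⊗ B; the columns of P = W ⊗ V
# are vec(e_i w_j^T), i.e. the E_ij-basis.
I = np.eye(n)
M = np.kron(B.T, I) + np.kron(I, B)
P = np.kron(W, V)

D = np.linalg.inv(P) @ M @ P   # matrix of F in the E_ij-basis
assert np.allclose(D, np.diag(np.diag(D)), atol=1e-7)   # diagonal, as claimed
# diagonal entries are lam_i + lam_j (the sum grid is symmetric, so any
# consistent flattening order matches)
expected = (lam[:, None] + lam[None, :]).ravel()
assert np.allclose(np.diag(D), expected, atol=1e-7)
```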