Minimal and characteristic polynomials on tensor product spaces
Solution 1:
This is a compilation of @darij grinberg's comments, which provide a partial answer to the question.
There is an operation that takes a monic polynomial $P$ of degree $n$ and a monic polynomial $Q$ of degree $m$ and returns a monic polynomial $R$ of degree $nm$ whose roots are the products of each root of $P$ with each root of $Q$ (in a suitable extension of $k$). If $P$ is the characteristic polynomial of $\varphi$, and $Q$ is that of $\psi$, then $R$ is that of $\varphi \otimes \psi$. Writing the $\ell$-th coefficient of $R$ in terms of those of $P$ and $Q$ (without speaking of roots) boils down to expanding the internal coproduct $\Delta_{\times} e_{\ell}$ of the elementary symmetric function $e_{\ell} \in \Lambda$ (where $\Lambda$ is the ring of symmetric functions in infinitely many variables) in the basis $(e_{\lambda} \otimes e_{\mu})$ (with $\lambda$ and $\mu$ ranging over partitions) of the tensor product $\Lambda \otimes \Lambda$. This can be done: $$\Delta_{\times} e_{\ell} = \sum_{\lambda \vdash \ell} s_{\lambda} \otimes s_{\lambda^t},$$ where $\lambda^t$ denotes the transpose of an integer partition $\lambda$, and $s_{\lambda}$ is the Schur function corresponding to ${\lambda}$.
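As a quick numerical sanity check of the roots description (this checks the roots of $R$, not the symmetric-function expansion), one can compare the characteristic polynomial of a Kronecker product against the polynomial built from pairwise eigenvalue products; the matrices below are arbitrary examples:

```python
import numpy as np

# Arbitrary example matrices standing in for phi and psi.
rng = np.random.default_rng(0)
A = rng.standard_normal((2, 2))
B = rng.standard_normal((3, 3))

# Coefficients of the monic polynomial whose roots are the pairwise
# products of the eigenvalues of A and B (np.kron of two 1-D arrays
# lists exactly these products) ...
R_from_roots = np.poly(np.kron(np.linalg.eigvals(A), np.linalg.eigvals(B)))
# ... agree with the characteristic polynomial of the tensor product.
R_direct = np.poly(np.kron(A, B))
assert np.allclose(R_from_roots, R_direct)
```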
The Schur functions can be written in terms of the elementary symmetric functions using the von Nägelsbach-Kostka identities. This is probably going to be a mouthful of new words to you if you don't have an algebraic combinatorics background; I don't think anything simpler works, though. (I have sketched a proof of the formula for $\Delta_{\times} e_{\ell}$ in the solution of Exercise 2.74(b) in Vic Reiner's notes on Hopf algebras in combinatorics, but this is probably not a good source on internal comultiplication.) This was about the characteristic polynomial; I can't say anything about the minimal polynomial.
On your question about $\psi$ being the identity matrix... well, that makes things simpler. Tensoring $\varphi$ with an $m \times m$ identity matrix is equivalent (in the sense that the results will be conjugate to each other) to taking the "block-diagonal" direct sum $\varphi^{\oplus m} \colon V^{\oplus m} \to V^{\oplus m}$. So the characteristic polynomial will be the $m$-th power of that of $\varphi$, and the minimal polynomial will be that of $\varphi$ (as long as $m \not= 0$).
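Both claims can be checked in sympy; the $2 \times 2$ matrix below is an arbitrary stand-in for $\varphi$, with $m = 3$:

```python
import sympy as sp

t = sp.symbols('t')
A = sp.Matrix([[0, 1], [-2, 3]])        # arbitrary stand-in for phi
m = 3
T = sp.kronecker_product(A, sp.eye(m))  # phi tensor the identity

# Characteristic polynomial of the tensor is the m-th power of phi's.
assert sp.expand(T.charpoly(t).as_expr() - A.charpoly(t).as_expr()**m) == 0

# The characteristic polynomial of phi annihilates T as well (Horner
# evaluation below), so the minimal polynomial of T divides it; since
# T is conjugate to the m-fold direct sum of A, the minimal
# polynomials of T and A in fact agree.
p_at_T = sp.zeros(*T.shape)
for c in A.charpoly(t).all_coeffs():
    p_at_T = p_at_T * T + c * sp.eye(T.shape[0])
assert p_at_T == sp.zeros(*T.shape)
```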
Actually, my taking of internal coproducts was overkill. It is enough to look at the dual Cauchy identity: $$ \prod_{i=1}^n\prod_{j=1}^m (1+x_iy_j)=\sum_{\lambda} s_{\lambda}(x_1,x_2,\ldots,x_n)s_{\lambda^t} (y_1,y_2,\ldots,y_m),$$ where the sum is over all partitions $\lambda$. (If you want, you can restrict it to partitions $\lambda$ whose largest part is $\leq m$ and whose length is $\leq n$; all other partitions contribute vanishing addends.) If you substitute $y_j t$ for $y_j$, you obtain $$ \prod_{i=1}^n\prod_{j=1}^m (1+x_iy_jt)=\sum_{\lambda} s_{\lambda}(x_1,x_2,\ldots,x_n)s_{\lambda^t} (y_1,y_2,\ldots,y_m)t^{|\lambda|},$$ and you should recognize the left hand side as the "reversed" characteristic polynomial of the tensor product of a matrix whose "reversed" characteristic polynomial is $\prod_{i=1}^n (1+x_i t)$ and a matrix whose "reversed" characteristic polynomial is $\prod_{j=1}^m (1+ y_j t)$.
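The dual Cauchy identity can be verified directly for small $n = m = 2$, writing the two-variable Schur polynomials via the bialternant formula; the helper `schur2` below is my own illustrative naming, not standard API:

```python
import sympy as sp
from itertools import product

x1, x2, y1, y2 = sp.symbols('x1 x2 y1 y2')

def schur2(lam, a, b):
    """Two-variable Schur polynomial s_lam(a, b), bialternant formula."""
    l1, l2 = (tuple(lam) + (0, 0))[:2]
    return sp.cancel((a**(l1 + 1) * b**l2 - b**(l1 + 1) * a**l2) / (a - b))

# Partitions fitting in a 2x2 box, paired with their transposes; all
# other partitions contribute vanishing Schur polynomials here.
pairs = [((), ()), ((1,), (1,)), ((2,), (1, 1)),
         ((1, 1), (2,)), ((2, 1), (2, 1)), ((2, 2), (2, 2))]

lhs = sp.expand(sp.Mul(*[1 + xi * yj
                         for xi, yj in product((x1, x2), (y1, y2))]))
rhs = sp.expand(sum(schur2(lam, x1, x2) * schur2(lamt, y1, y2)
                    for lam, lamt in pairs))
assert lhs == rhs
```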
By "reversed" characteristic polynomial, I mean $\det(I_n+At)$ (where $A$ is the matrix and $n$ is its size), as opposed to the usual characteristic polynomial $\det(tI_n-A)$. The $x_i$'s and $y_j$'s are the eigenvalues (in a suitable extension of $k$) of $\varphi$ and $\psi$, respectively; the elementary symmetric polynomials in them are the coefficients of the respective reversed characteristic polynomials, or (up to sign and order) those of the usual characteristic polynomials.