Is there an explicit expression relating $AA^H$ and $\mathrm{vec}(A)\mathrm{vec}(A)^H$?
Here is the idea: $\mathrm{vec}(A)\mathrm{vec}(A)^H$ is an outer product that contains all pairwise products of elements of $A$ (with an additional complex conjugate on the second factor, but that doesn't matter here). And $AA^H$ is made of sums of such products. If we can find which elements of $\mathrm{vec}(A)\mathrm{vec}(A)^H$ to add, we can recover $AA^H$.
I'll change the notation a bit. Let $A\in M_{n,m}(\Bbb C)$ and $M=AA^H$.
Then, denoting the $i$th row of $A$ by $r_i$, we have $M_{i,j}=r_i\cdot r_j$, where the product is the complex dot product.
Because of this, there is an easy relationship between $AA^H$ and $P=\mathrm{vec}(A^T)\mathrm{vec}(A^T)^H$. Note that the effect of $\mathrm{vec}(A)$ is to stack all the columns of $A$ into a single column vector. The elements of a given column (not a row) of $A$ are thus stored close to each other in $\mathrm{vec}(A)$. Likewise, the elements of a given row of $A$ are stored close to each other in $\mathrm{vec}(A^T)$.
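For concreteness, here is a small NumPy sketch (the $2\times 3$ matrix is just an arbitrary example) showing that $\mathrm{vec}(A)$ groups entries by column while $\mathrm{vec}(A^T)$ groups them by row; `flatten(order='F')` gives column-major stacking, i.e. the standard $\mathrm{vec}$.

```python
import numpy as np

# Arbitrary 2x3 example (n = 2 rows, m = 3 columns), just to show the layout.
A = np.array([[1, 2, 3],
              [4, 5, 6]])

# vec(A): stack the columns of A, so entries of each COLUMN end up adjacent.
vec_A = A.flatten(order='F')      # [1, 4, 2, 5, 3, 6]

# vec(A^T): stack the columns of A^T, i.e. the ROWS of A,
# so entries of each row of A end up adjacent.
vec_AT = A.T.flatten(order='F')   # [1, 2, 3, 4, 5, 6]

print(vec_A)   # [1 4 2 5 3 6]
print(vec_AT)  # [1 2 3 4 5 6]
```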
Then we just have to sum the right elements of $P$ to recover the elements of $AA^H$. Precisely,
$$M_{i,j}=\sum_{k=1}^m P_{k+(i-1)m,\,k+(j-1)m}$$
That is, $M_{i,j}$ is the trace of the $(i,j)$ block of $P$, of size $m\times m$.
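A quick numerical check of this formula (a sketch only; NumPy uses 0-based indices, so the trace of the $(i,j)$ block becomes the trace of the slice `P[i*m:(i+1)*m, j*m:(j+1)*m]`):

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 3, 4
A = rng.standard_normal((n, m)) + 1j * rng.standard_normal((n, m))

M = A @ A.conj().T

# vec(A^T): stack the columns of A^T, i.e. the rows of A.
vec_AT = A.T.flatten(order='F').reshape(-1, 1)
P = vec_AT @ vec_AT.conj().T

# M[i,j] is the trace of the (i,j) block of P of size m x m.
M_rebuilt = np.empty((n, n), dtype=complex)
for i in range(n):
    for j in range(n):
        M_rebuilt[i, j] = np.trace(P[i*m:(i+1)*m, j*m:(j+1)*m])

assert np.allclose(M, M_rebuilt)
```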
However, you are not interested in $P$, but in $Q=\mathrm{vec}(A)\mathrm{vec}(A)^H$. Quite obviously, the elements of $P$ and $Q$ are the same, just not in the same places. However, you can check that the formula is only slightly modified:
$$M_{i,j}=\sum_{k=1}^m Q_{i+(k-1)n,\,j+(k-1)n}$$
$M_{i,j}$ is again the trace of a submatrix of $Q$, but this time not a contiguous block: the selected rows and columns are spaced $n$ apart.
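The same kind of check works here, with the 1-based entries $Q_{i+(k-1)n,\,j+(k-1)n}$ becoming the 0-based entries `Q[i + k*n, j + k*n]` (again only a sketch):

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 3, 4
A = rng.standard_normal((n, m)) + 1j * rng.standard_normal((n, m))

M = A @ A.conj().T

vec_A = A.flatten(order='F').reshape(-1, 1)   # standard vec: stack the columns of A
Q = vec_A @ vec_A.conj().T

# M[i,j] is a trace over a non-contiguous submatrix of Q, sampled with stride n.
M_rebuilt = np.empty((n, n), dtype=complex)
for i in range(n):
    for j in range(n):
        M_rebuilt[i, j] = sum(Q[i + k*n, j + k*n] for k in range(m))

assert np.allclose(M, M_rebuilt)
```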
What about the converse? Knowing $AA^H$, is it possible to find $\mathrm{vec}(A)\mathrm{vec}(A)^H$? The solution is not unique: we can see that $AA^H$ is a Gram matrix, and a vector realization (i.e., $A$) is only known up to a unitary transformation. And a unitary transformation won't preserve the pairwise products found in $\mathrm{vec}(A)\mathrm{vec}(A)^H$.
One such relationship is as follows. Suppose $A \in \Bbb C^{N \times M}$ has entries $a_{ij}$ for $1 \leq i \leq N$, $1 \leq j \leq M$. Define $f:\{1,\dots,N\}\times \{1,\dots,M\} \to \{1,\dots,NM\}$ by $$ f(i,j) = N(j-1) + i. $$ If $\operatorname{vec}$ denotes the standard (column-stacking) vectorization operator, then $$ [A]_{ij} = a_{ij} = \operatorname{vec}(A)_{f(i,j)}. $$ Thus, for $1 \leq i,j \leq N$ we have $$ [AA^H]_{ij} = \sum_{k=1}^M a_{ik} \bar a_{jk} = \sum_{k=1}^M \operatorname{vec}(A)_{f(i,k)} \overline{\operatorname{vec}(A)}_{f(j,k)} \\ = \sum_{k=1}^M [\operatorname{vec}(A)\operatorname{vec}(A)^H]_{f(i,k),f(j,k)}. $$

If you prefer, this can be expressed very nicely in terms of the partial trace. For a block matrix $$ X = \pmatrix{ X_{11} & \cdots & X_{1M}\\ \vdots & \ddots & \vdots\\ X_{M1} & \cdots & X_{MM}} $$ where each block $X_{kl} \in \Bbb C^{N \times N}$, define $$ \operatorname{tr}_1(X) = X_{11} + \cdots + X_{MM}. $$ With this notation, we have $$ AA^H = \operatorname{tr}_1[\operatorname{vec}(A)\operatorname{vec}(A)^H]. $$
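The partial-trace form is easy to test numerically; the helper `partial_trace_blocks` below is just an ad hoc name for summing the diagonal $N\times N$ blocks, not a library function:

```python
import numpy as np

def partial_trace_blocks(X, block_size):
    """Sum the diagonal blocks of X, viewed as a block matrix with
    square blocks of size block_size (the tr_1 defined above)."""
    nblocks = X.shape[0] // block_size
    return sum(X[k*block_size:(k+1)*block_size, k*block_size:(k+1)*block_size]
               for k in range(nblocks))

rng = np.random.default_rng(2)
N, M = 3, 4                            # A is N x M
A = rng.standard_normal((N, M)) + 1j * rng.standard_normal((N, M))

vec_A = A.flatten(order='F').reshape(-1, 1)
outer = vec_A @ vec_A.conj().T         # (N*M) x (N*M)

# AA^H equals the sum of the M diagonal N x N blocks of vec(A) vec(A)^H.
assert np.allclose(A @ A.conj().T, partial_trace_blocks(outer, N))
```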
In order to prove that the trace formula works in general, it suffices to show that $$ AB^T = \operatorname{tr}_1[\operatorname{vec}(A)\operatorname{vec}(B)^T] $$ holds for arbitrary matrices $A,B$ of the same size ($N\times M$ here). In fact, it suffices to prove that this holds when $A,B$ have rank $1$, since the formula then holds in the remaining cases by linearity. Suppose that $$ A = uv^T, \quad B = xy^T. $$ Note that $$ \operatorname{vec}(A) = \operatorname{vec}(uv^T) = v \otimes u, $$ where $\otimes$ denotes the Kronecker product. From there, we have $$ \begin{align} \operatorname{tr}_1[\operatorname{vec}(A)\operatorname{vec}(B)^T] &= \operatorname{tr}_1[(v \otimes u)(y \otimes x)^T] \\ & = \operatorname{tr}_1[(vy^T)\otimes(ux^T)] = \operatorname{tr}(vy^T) \cdot ux^T = (y^Tv) \cdot ux^T. \end{align} $$ On the other hand, $$ \begin{align} AB^T &= (uv^T)(xy^T)^T = (uv^T)(yx^T) \\ & = u(v^Ty) x^T = (v^Ty) \cdot ux^T. \end{align} $$ Once we conclude that $AB^T = \operatorname{tr}_1[\operatorname{vec}(A)\operatorname{vec}(B)^T]$ holds in general, the formula in your case follows by setting $B = \overline{A}$, since $A\overline{A}^T = AA^H$ and $\operatorname{vec}(\overline{A})^T = \operatorname{vec}(A)^H$.
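To see the rank-one computation in action, here is a small numerical sketch of the identity $AB^T = \operatorname{tr}_1[\operatorname{vec}(A)\operatorname{vec}(B)^T]$ for $A = uv^T$, $B = xy^T$ (real vectors for simplicity, so no conjugates are involved):

```python
import numpy as np

rng = np.random.default_rng(3)
N, M = 3, 4

# Rank-1 matrices A = u v^T and B = x y^T, both N x M.
u, x = rng.standard_normal(N), rng.standard_normal(N)
v, y = rng.standard_normal(M), rng.standard_normal(M)
A = np.outer(u, v)
B = np.outer(x, y)

# vec(u v^T) = v ⊗ u.
assert np.allclose(A.flatten(order='F'), np.kron(v, u))

vecA = A.flatten(order='F').reshape(-1, 1)
vecB = B.flatten(order='F').reshape(-1, 1)
outer = vecA @ vecB.T

# tr_1: sum the M diagonal N x N blocks.
tr1 = sum(outer[k*N:(k+1)*N, k*N:(k+1)*N] for k in range(M))

assert np.allclose(A @ B.T, tr1)
assert np.allclose(tr1, (y @ v) * np.outer(u, x))   # = (y^T v) u x^T
```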