Why must $U(1)$ matrices commute with $SU(2) \times SU(3)$ matrices in embedding within $SU(5)$?
I'm a physicist taking a groups course.
I can believe that the direct sum of the fundamental representations of $SU(2)$ and $SU(3)$ works as an embedding of the $SU(2) \times SU(3)$ subgroup into the fundamental representation of $SU(5)$ (since you have a $5\times 5$ matrix composed of a $2\times 2$ block and a $3\times 3$ block, so the two factors obviously act on separate vector subspaces, as in the direct product).
However, it is often said that to embed $SU(2) \times SU(3) \times U(1)$ we can also represent $U(1)$ by $5\times 5$ matrices which commute with the matrices above. I don't understand how this commuting constraint means that the $U(1)$ part of the subgroup acts on a 'separate vector space' like the direct product suggests?
Ok, since a hint was not enough, here is an answer.
First of all, please, never use the terminology 'a separate vector space.' It is meaningless. What you really meant to say is that you have a direct sum decomposition ${\mathbb C}^5=V\oplus W$, where $V, W$ are (complex-linear) subspaces of ${\mathbb C}^5$ of dimensions 2 and 3 respectively. Concretely, you can take $$ V=\{(z_1,...,z_5): z_3=z_4=z_5=0\}, \quad W= \{(z_1,...,z_5): z_1=z_2=0\}. $$ These are the subspaces that you call 'separate.'

Then you take the standard (fundamental) representations of $SU(2)$ (acting on $V$) and of $SU(3)$ (acting on $W$) and define a faithful representation $$ \rho: SU(2)\times SU(3)\to SU(5) $$ sending each pair of matrices $(A, B)$ (with $A\in SU(2), B\in SU(3)$) to the block-diagonal matrix $$ \left[\begin{array}{cc} A&0\\ 0&B\end{array}\right]. $$

Given all this, you are effectively asking how to extend the representation $\rho$ to a faithful representation $$ \hat\rho: SU(2)\times SU(3)\times U(1)\to U(5). $$ This is my reading of the sentence "I don't understand how this commuting constraint means that the $U(1)$ part of the subgroup acts on a 'separate vector space' like the direct product suggests?" (which, on its face, does not make sense even on the level of syntax). I think your confusion stems from the invalid assumption that in order to construct a faithful representation of a direct product of groups you have to have a corresponding direct sum decomposition of the corresponding vector space.
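As a numerical sanity check, here is a short NumPy sketch of the block-diagonal embedding (the helper names `random_su` and `rho` are mine, not standard): it verifies that $\rho(A,B)$ is unitary with determinant 1, i.e. lands in $SU(5)$, and that $\rho$ is a homomorphism.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_su(n):
    """A pseudo-random element of SU(n): QR-orthonormalize a complex
    Gaussian matrix, then rescale one column so the determinant is 1."""
    z = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
    q, _ = np.linalg.qr(z)          # q is unitary, so |det q| = 1
    q[:, 0] /= np.linalg.det(q)     # divide a column by the det (a phase)
    return q

def rho(A, B):
    """Block-diagonal embedding rho: SU(2) x SU(3) -> SU(5)."""
    out = np.zeros((5, 5), dtype=complex)
    out[:2, :2] = A
    out[2:, 2:] = B
    return out

A1, A2 = random_su(2), random_su(2)
B1, B2 = random_su(3), random_su(3)
M = rho(A1, B1)

assert np.allclose(M.conj().T @ M, np.eye(5))   # M is unitary
assert np.isclose(np.linalg.det(M), 1)          # det M = 1, so M is in SU(5)
# homomorphism property: rho(A1 A2, B1 B2) = rho(A1, B1) rho(A2, B2)
assert np.allclose(rho(A1 @ A2, B1 @ B2), rho(A1, B1) @ rho(A2, B2))
```

The homomorphism check passes precisely because $A$ and $B$ sit in non-overlapping blocks: multiplying two block-diagonal matrices multiplies the blocks independently.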
The extension $\hat\rho$ is constructed by sending each element of $U(1)$ (which is just a unit complex number $t$) to the corresponding scalar 5-by-5 matrix $$ \hat\rho(t)= \mathrm{diag}(t,t,t,t,t)=tI_5. $$ On the $SU(2)$ and $SU(3)$ factors I use, of course, the same representation $\rho$ as above, so that on a general element $$ \hat\rho(A,B,t)= t\,\rho(A,B). $$
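The commuting condition costs nothing here, because $\hat\rho(t)$ is a scalar matrix and scalar matrices commute with every $5\times 5$ matrix, not just the block-diagonal ones. A minimal NumPy illustration (the name `rho_hat_u1` is mine):

```python
import numpy as np

def rho_hat_u1(t):
    """The U(1) factor: a unit complex number t goes to the scalar matrix t*I_5."""
    return t * np.eye(5, dtype=complex)

t = np.exp(0.7j)                                 # an arbitrary point on the unit circle
X = np.arange(25, dtype=complex).reshape(5, 5)   # an arbitrary 5x5 matrix

# A scalar matrix commutes with *every* 5x5 matrix, in particular with rho(A, B):
assert np.allclose(rho_hat_u1(t) @ X, X @ rho_hat_u1(t))
```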
It is an easy exercise to check that $\hat\rho(t)$ commutes with all the matrices $\rho(A, B)$ as above, thus resulting in a representation $$ \hat\rho: SU(2)\times SU(3)\times U(1)\to U(5). $$ With a bit more thought, you can check that $\hat\rho$ is faithful. The key is the observation that for all matrices $A, B$ as above and each $t\in U(1)$, $$ \rho(A,B)=\hat\rho(t) $$ if and only if $t=1$, $A=I_2$, $B=I_3$: the equality forces $A=tI_2$ and $B=tI_3$, and taking determinants gives $t^2=1$ and $t^3=1$, hence $t=1$. (Incidentally, this argument would fail if I tried a similar construction with $SU(2)\times SU(2)\times U(1)$ and a 4-dimensional representation.)
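To make the parenthetical remark concrete: in the 4-dimensional analogue, $t=-1$ together with $A=B=-I_2$ is a nontrivial element mapping to the same matrix as $(I_2, I_2, 1)$ scaled by $t$, so faithfulness fails. A quick NumPy check:

```python
import numpy as np

I2 = np.eye(2)

# The block-diagonal matrix rho(-I2, -I2) for the SU(2) x SU(2) analogue:
blockdiag = np.zeros((4, 4))
blockdiag[:2, :2] = -I2
blockdiag[2:, 2:] = -I2

# It coincides with the scalar matrix (-1)*I_4, i.e. rho(-I2, -I2) = rho_hat(-1):
assert np.allclose(blockdiag, -np.eye(4))
# And -I2 really lies in SU(2), since det(-I2) = (-1)^2 = 1:
assert np.isclose(np.linalg.det(-I2), 1)
```

By contrast, in the $SU(2)\times SU(3)$ case the two block sizes are coprime, so $t^2=1$ and $t^3=1$ force $t=1$ and no such nontrivial coincidence exists.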