$\frac{\partial}{\partial X_{ij}} \sum_k\sum_l\sum_m\sum_n X_{lk}C_{lm}X_{mn}N_{nk}=\sum_m\sum_n C_{im}X_{mn}N_{nj} + \sum_k\sum_l X_{lk}C_{li}N_{jk}$
Solution 1:
$ \def\d{\delta}\def\o{{\tt1}}\def\p{\partial} \def\B{\Big}\def\L{\left}\def\R{\right} \def\LR#1{\L(#1\R)} \def\BR#1{\B(#1\B)} \def\trace#1{\operatorname{Tr}\LR{#1}} \def\qiq{\quad\implies\quad} \def\grad#1#2{\frac{\p #1}{\p #2}} \def\S#1{\sum_{#1}} \def\c#1{\color{red}{#1}} $You are almost using index notation, which is arguably the most powerful way to approach such problems.
The next step is to adopt the Einstein summation convention, wherein a repeated index implies summation over that index, so that explicit summation symbols can be omitted:
$$\eqalign{ \S k\S l\S m\S n X_{lk}C_{lm}X_{mn}N_{nk} \qiq X_{lk}C_{lm}X_{mn}N_{nk} }$$
The next thing to master is the behavior of the Kronecker delta under summation
$$A_{ij}\,\d_{jk} = A_{ik} = A_{ki}^T$$
and the matrix self-gradient
$$\grad{X_{lk}}{X_{ij}} = \d_{il}\,\d_{jk}$$
All of this machinery allows the current problem to be handled very mechanically:
$$\eqalign{
\grad{\trace{X^TCXN}}{X_{ij}}
 &= \grad{\LR{X_{lk}C_{lm}X_{mn}N_{nk}}}{X_{ij}} \\
 &= \c{\d_{il}\d_{jk}}\,C_{lm}X_{mn}N_{nk} + X_{lk}C_{lm}\,\c{\d_{im}\d_{jn}}\,N_{nk} \\
 &= C_{im}X_{mn}N_{nj} + X_{lk}C_{li}N_{jk} \\
 &= C_{im}X_{mn}N_{nj} + C_{il}^TX_{lk}N_{kj}^T \\
 &= \LR{CXN + C^TXN^T}_{ij} \\
}$$
as well as the quartic problem:
$$\eqalign{
\grad{\trace{X^TXX^TX}}{X_{ij}}
 &= \grad{\LR{X_{kn}X_{kl}X_{ml}X_{mn}}}{X_{ij}} \\
\\
 &= \c{\d_{ik}\d_{jn}}X_{kl}X_{ml}X_{mn} \\
 &+\; X_{kn}\c{\d_{ik}\d_{jl}}X_{ml}X_{mn} \\
 &+\; X_{kn}X_{kl}\c{\d_{im}\d_{jl}}X_{mn} \\
 &+\; X_{kn}X_{kl}X_{ml}\c{\d_{im}\d_{jn}} \\
\\
 &= X_{il}X_{ml}X_{mj} + X_{in}X_{mj}X_{mn} \\
 &+\; X_{kn}X_{kj}X_{in} + X_{kj}X_{kl}X_{il} \\
\\
 &= X_{il}X_{lm}^TX_{mj} + X_{in}X_{nm}^TX_{mj} \\
 &+\; X_{in}X_{nk}^TX_{kj} + X_{il}X_{lk}^TX_{kj} \\
\\
 &= \LR{4\,XX^TX}_{ij} \\
}$$
In matrix form, the two gradients are therefore $CXN + C^TXN^T$ and $4\,XX^TX$.
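If you want to convince yourself numerically, here is a minimal NumPy sketch (not part of the original derivation) that compares both closed-form gradients against a central finite-difference gradient. The helper `num_grad`, the matrix size `n`, and the random test matrices are illustrative choices, not anything prescribed by the problem.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
X = rng.standard_normal((n, n))
C = rng.standard_normal((n, n))
N = rng.standard_normal((n, n))

def num_grad(f, X, eps=1e-6):
    """Central finite-difference gradient of a scalar function of a matrix."""
    G = np.zeros_like(X)
    for i in range(X.shape[0]):
        for j in range(X.shape[1]):
            E = np.zeros_like(X)
            E[i, j] = eps
            G[i, j] = (f(X + E) - f(X - E)) / (2 * eps)
    return G

# Tr(X^T C X N):  gradient should be  C X N + C^T X N^T
f1 = lambda X: np.trace(X.T @ C @ X @ N)
assert np.allclose(num_grad(f1, X), C @ X @ N + C.T @ X @ N.T, atol=1e-5)

# Tr(X^T X X^T X):  gradient should be  4 X X^T X
f2 = lambda X: np.trace(X.T @ X @ X.T @ X)
assert np.allclose(num_grad(f2, X), 4 * X @ X.T @ X, atol=1e-4)

print("both gradient formulas agree with finite differences")
```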