What is the technical reason for modern graphics cards only supporting 4 monitors?
Generally, even a protocol such as DisplayPort requires transceiver electronics that are completely distinct from the GPU's other buses. The way a graphics card talks to a monitor over HDMI or DisplayPort is significantly different from the way it talks to memory or to its internal units: the signaling voltages differ, the current requirements differ, and the link needs error handling to survive a trip down a cable of questionable quality.
What you are probably seeing is a limit on how many hardware display streams the GPU can route out to its various display transmitters. There may be a complicated multiplexer/switching arrangement that merges MST streams together, and the path from the main GPU to that multiplexer may simply have only 4 ports. There are graphics cards that do support 6 streams, so those presumably have more ports running from the GPU to the display transmitter electronics.
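To make the MST stream arithmetic concrete, here is a back-of-the-envelope sketch in C. The figures are standard DisplayPort 1.2 numbers (4 lanes at HBR2's 5.4 Gbit/s per lane, 8b/10b line coding), but it ignores blanking overhead, so treat the resulting stream count as a rough upper bound rather than what real hardware would actually negotiate:

```c
#include <stdio.h>

int main(void)
{
    /* DisplayPort 1.2 HBR2 main link: 4 lanes x 5.4 Gbit/s,
     * 8b/10b line coding (80% efficient). */
    const double lanes = 4.0;
    const double lane_rate_gbps = 5.4;
    const double coding_efficiency = 0.8;
    double link_gbps = lanes * lane_rate_gbps * coding_efficiency;

    /* One uncompressed 1920x1080 @ 60 Hz stream at 24 bits per pixel,
     * ignoring blanking intervals. */
    double stream_gbps = 1920.0 * 1080.0 * 60.0 * 24.0 / 1e9;

    printf("usable link bandwidth: %.2f Gbit/s\n", link_gbps);   /* ~17.28 */
    printf("one 1080p60 stream:    %.2f Gbit/s\n", stream_gbps); /* ~2.99  */
    printf("streams that fit:      %d\n", (int)(link_gbps / stream_gbps));
    return 0;
}
```

The point of the sketch is that the cable itself could carry several streams; if the card still tops out at 4 displays, the bottleneck is in how many streams the GPU can generate and route, not in the link.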
It is not a matter of memory bandwidth, and it likely has nothing to do with ROPs or anything else internal to the GPU or memory bus. It most likely boils down to a combined marketing and engineering choice: "most users probably only need X displays" along with "it's easy enough for the user to buy another card if they need more".
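On Linux you can actually inspect this per-chip limit through the kernel's DRM/KMS interface: the driver reports how many independent scanout engines (CRTCs) the GPU has, and that count, not the number of physical connectors, is what caps simultaneous displays. A minimal sketch, assuming a libdrm development setup and that the card of interest is `/dev/dri/card0`:

```c
#include <stdio.h>
#include <fcntl.h>
#include <unistd.h>
#include <xf86drm.h>
#include <xf86drmMode.h>

int main(void)
{
    /* Assumes the GPU of interest is the first DRM device. */
    int fd = open("/dev/dri/card0", O_RDWR);
    if (fd < 0) {
        perror("open /dev/dri/card0");
        return 1;
    }

    drmModeRes *res = drmModeGetResources(fd);
    if (!res) {
        fprintf(stderr, "drmModeGetResources failed\n");
        close(fd);
        return 1;
    }

    /* Connectors are the physical ports; encoders feed the transmitter
     * electronics; CRTCs are the independent scanout engines whose count
     * is the hard ceiling on simultaneous displays. */
    printf("connectors: %d\n", res->count_connectors);
    printf("encoders:   %d\n", res->count_encoders);
    printf("crtcs:      %d\n", res->count_crtcs);

    drmModeFreeResources(res);
    close(fd);
    return 0;
}
```

Compile with something like `gcc probe.c $(pkg-config --cflags --libs libdrm)`. On many cards you will see more connectors than CRTCs, which is exactly the "more physical ports than streams" situation described above.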