Is better a low power WiFi signal without interference or a more powerful signal with interference?
In general you will want to use the quieter band.
It's difficult to take the characteristics of 5 GHz vs 2.4 GHz out of the equation - and that comparison may send you down the wrong track - but this answer will attempt to do so.
There are (at least) two aspects here. The more obvious one is this: if there is a significant amount of other traffic in the band, then no matter how much you boost your signal, you are still not going to get the throughput you want, because each device needs its share of the airtime (ie it has to wait its turn). I am assuming here that everything is transmitting within the legal limit. [ This ties in with the other aspect: if you shout loudly enough, you will make the other devices somewhat deaf to the quieter ones and gain a bit more throughput, but it's an un-winnable arms race, particularly because transmit power is legally capped. ]
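The airtime-sharing point above can be sketched in a few lines. This is a hypothetical illustration (the function name and numbers are made up), assuming ideal round-robin airtime fairness and identical link rates - real Wi-Fi contention is messier, but the ceiling is the same:

```python
# Sketch of airtime sharing: boosting your own transmit power cannot
# raise your share of the channel beyond link_rate / n_devices.

def per_device_throughput(link_rate_mbps, n_devices):
    """Each device gets an equal slice of airtime on the channel."""
    return link_rate_mbps / n_devices

# One device alone on the channel vs. sharing with four neighbours:
print(per_device_throughput(100, 1))  # 100.0
print(per_device_throughput(100, 5))  # 20.0
```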
The two advantages of a "louder" router don't help you here. They are: 1. It can be heard over a wider distance. 2. The receiving equipment can be less sensitive and still perform.
CONCLUSION: A less noisy channel with a low power transmitter will outperform a louder channel with more noise.
The other element (which, if you work through the logic in enough depth, yields the same conclusion from a different point of view) is this:
The amount of data that can be transmitted depends on the difference between the signal and the noise (ie the "signal-to-noise ratio"). Assuming the equipment is very sensitive (so sensitivity is not an issue - a claim which is "true enough" for Wi-Fi), what you care about is the ratio of signal to noise. If you have a transmitter transmitting at 100 mW with noise equivalent to 10 mW (this is a crude simplification - signal strength is actually measured in dB - but it illustrates the concept), you get a ratio of 10:1. On the other hand, if you have a 1 W transmitter with noise equivalent to 333 mW, you have a ratio of only 3:1. Thus the quieter channel, despite the much weaker transmitter, supports substantially more throughput.
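To make the SNR comparison concrete: throughput scales with the channel capacity C = B · log2(1 + SNR) (the Shannon limit), so the advantage is smaller than the raw power ratio suggests, but the quieter channel still wins. A minimal sketch, assuming a 20 MHz channel width and the linear SNR ratios from the example above:

```python
import math

# Shannon capacity C = B * log2(1 + SNR), reported in Mbit/s.
# Bandwidth (20 MHz) and SNR values are illustrative, not measured.

def capacity_mbps(bandwidth_hz, snr_linear):
    return bandwidth_hz * math.log2(1 + snr_linear) / 1e6

quiet = capacity_mbps(20e6, 10)  # 100 mW signal over ~10 mW noise -> 10:1
noisy = capacity_mbps(20e6, 3)   # 1 W signal over ~333 mW noise   -> 3:1
print(quiet > noisy)  # True - the low-power, low-noise link carries more data
```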
One way you can conceptualize it (light works the same way): look at the sky during the day - you will see very, very few stars, because the sun is making your eyes insensitive to light. On a dark night, far away from the city, you can see a huge number of stars, because there is no big source of light interference. In this example your eyes act as a receiver, the stars as transmitters of various strengths, and the sun as a source of noise.
this would actually be a better question for electronics.stackexchange.
You should choose the 5 GHz spectrum.
A 2.4 GHz RF signal, due to its lower frequency, will penetrate walls and objects more easily.
However, a 5 GHz signal has a shorter wavelength, and therefore its antenna can be electrically longer and better tuned for higher gain within the same space a 2.4 GHz antenna would occupy.
Additionally, the 5 GHz spectrum has 23 non-overlapping channels, while the 2.4 GHz spectrum only has 3. In my experience there's much less interference, leading to a much cleaner signal and significantly better operation - more than can be gained from a louder signal.

For now, at least - until Xfinity/Comcast starts installing 802.11ac/5 GHz routers everywhere by default.
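The "only 3 non-overlapping channels" point falls out of simple arithmetic: 2.4 GHz channels are centred 5 MHz apart but each occupies roughly 20 MHz. A quick sketch (US channel plan assumed):

```python
# Why only channels 1, 6 and 11 coexist cleanly in the 2.4 GHz band:
# centres are 5 MHz apart, but each transmission is ~20 MHz wide.

def center_mhz(channel):
    """Centre frequency of a 2.4 GHz channel (channel 1 = 2412 MHz)."""
    return 2412 + 5 * (channel - 1)

def overlaps(a, b, width_mhz=20):
    """Two channels interfere if their centres are closer than one width."""
    return abs(center_mhz(a) - center_mhz(b)) < width_mhz

print(overlaps(1, 6))   # False - 25 MHz apart, safe to use together
print(overlaps(1, 3))   # True  - only 10 MHz apart, they interfere
```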
Finally, if you're running DD-WRT/Tomato-USB or something similar, understand that bumping the radio power can introduce RF harmonics that hurt signal quality more than the extra power helps - not to mention overheat the radio. My Wi-Fi radio fried when the cleaning ladies stacked the router/Wi-Fi/switch/modem electronics all neatly in one pile.
Interference will kill your throughput. Period.
Therefore, choose the channel without interference.
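Picking the channel with the least interference is just a minimum over your survey results. A toy sketch - the scan data here is entirely made up (in practice you'd get it from a Wi-Fi analyzer app or your OS's scan tool):

```python
# Hypothetical survey: number of competing networks seen per channel.
scan_results = {1: 7, 6: 4, 11: 2}

# Choose the channel with the fewest competitors.
best_channel = min(scan_results, key=scan_results.get)
print(best_channel)  # 11
```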
Also, read about the 'hidden node' problem. It will be exacerbated by a strong AP signal.