Bandwidth vs. throughput in wired/wireless networks
Solution 1:
Bandwidth == the theoretical maximum data rate of the transmission medium, a.k.a. channel capacity.
Throughput == the rate at which data actually gets transferred end to end, i.e. real-world speed.
Why is it that way? Because there are things like crosstalk, error rates, alpha particles, and the filthy, filthy lies that ISPs tell about their bandwidth capabilities. Even on a LAN you'll never see a full 100 Mb/s or 1 Gb/s due to environmental factors, broadcast traffic, etc.
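If you want to see the gap for yourself, the usual approach is to time how long a known amount of data takes to move and divide. Below is a minimal Python sketch of that idea; it runs over loopback purely so it is self-contained (the port number and transfer size are arbitrary choices of mine). In practice you would point the receiving side at a real remote host and compare the measured number against the link's rated bandwidth.

```python
import socket
import threading
import time

PAYLOAD = b"\x00" * 65536          # 64 KiB chunks
TOTAL_BYTES = 128 * 1024 * 1024    # move 128 MiB in total (arbitrary)
PORT = 50007                       # arbitrary local port for the demo

def sender(port: int) -> None:
    """Accept one connection and push TOTAL_BYTES through it."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind(("127.0.0.1", port))
        srv.listen(1)
        conn, _ = srv.accept()
        with conn:
            sent = 0
            while sent < TOTAL_BYTES:
                conn.sendall(PAYLOAD)
                sent += len(PAYLOAD)

def measure_throughput(port: int) -> float:
    """Receive the stream and return achieved throughput in bits per second."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
        cli.connect(("127.0.0.1", port))
        received = 0
        start = time.perf_counter()
        while received < TOTAL_BYTES:
            chunk = cli.recv(65536)
            if not chunk:
                break
            received += len(chunk)
        elapsed = time.perf_counter() - start
    return received * 8 / elapsed

if __name__ == "__main__":
    threading.Thread(target=sender, args=(PORT,), daemon=True).start()
    time.sleep(0.2)                # crude wait for the listener to come up
    bps = measure_throughput(PORT)
    print(f"Measured throughput: {bps / 1e6:.1f} Mb/s")
    # Compare this against the link's rated bandwidth (e.g. 1000 Mb/s for
    # gigabit Ethernet) to see how far reality falls short of the rating.
```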
As for your last question: IMO an 802.11 wireless network with the same bandwidth rating will have lower throughput because of the increased effects of environmental interference, as well as the inherent limitations of CSMA/CA. If you're referring to different wireless transfer technologies such as microwave backhauls, you'll have to wait for someone with expertise in that area to address the topic.
Solution 2:
If you compare '54 Mbps' 802.11 with, say, a fiber Ethernet internet link configured for 54 Mbps, you will find that the Ethernet link is faster. That's because 802.11-style wireless doesn't really have a defined bandwidth; it can vary all over the place. The maximum throughput of '54 Mbps' 802.11 is only about 27 Mbps anyway, and even then you essentially never see that in practice. Bluetooth and ZigBee are the same; you have to look hard to find out what you'll actually get from them, and it depends a lot on circumstances.
The '54 Mbps' actually refers to the raw symbol rate of the bodies (not the headers) of the fastest packets the system is able to use. But that fastest rate is very seldom used; even then, the headers are sent at a much slower rate, and there is dead time between packets for the collision-avoidance algorithm to run. A rough airtime calculation is sketched below.
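To make that concrete, here is a back-of-the-envelope airtime budget for a single 1500-byte frame sent at the 54 Mb/s 802.11g OFDM rate. The timing constants (SIFS, slot time, preamble length, bits per symbol, average backoff, MAC/ACK sizes) are the nominal 802.11g values as I recall them and should be treated as assumptions; the point is only that preambles, ACKs, and inter-frame gaps eat well over a third of the nominal rate even under ideal conditions.

```python
# Approximate per-frame airtime for one 1500-byte frame at the 54 Mb/s
# 802.11g OFDM rate, using assumed nominal timing constants.
# All times are in microseconds.
import math

SLOT        = 9                        # short slot time
SIFS        = 10
DIFS        = SIFS + 2 * SLOT          # 28 us
CWMIN       = 15
AVG_BACKOFF = (CWMIN / 2) * SLOT       # ~67.5 us average random backoff
PREAMBLE    = 20                       # OFDM PLCP preamble + SIGNAL field
SYMBOL      = 4                        # OFDM symbol duration

def ofdm_airtime(payload_bytes: int, data_bits_per_symbol: int) -> float:
    """Airtime of one PPDU: preamble plus enough 4-us symbols to carry
    the SERVICE field (16 bits), the payload, and the tail (6 bits)."""
    bits = 16 + 8 * payload_bytes + 6
    return PREAMBLE + math.ceil(bits / data_bits_per_symbol) * SYMBOL

msdu         = 1500                    # IP packet handed to the MAC
mac_overhead = 28                      # MAC header + FCS, roughly
data_airtime = ofdm_airtime(msdu + mac_overhead, 216)  # 216 bits/symbol @ 54 Mb/s
ack_airtime  = ofdm_airtime(14, 96)                    # 14-byte ACK @ 24 Mb/s

per_frame = DIFS + AVG_BACKOFF + data_airtime + SIFS + ack_airtime
throughput_mbps = (msdu * 8) / per_frame   # bits per microsecond == Mb/s

print(f"Airtime per frame: {per_frame:.1f} us")
print(f"Effective throughput: {throughput_mbps:.1f} Mb/s")  # roughly 30, not 54
```

And that is the best case: one station, no collisions, no retries, no legacy-protection overhead. Real-world contention and interference push the number down further, toward or below the ~27 Mbps figure above.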
Point-to-point microwave, on the other hand, is exactly like Ethernet, right up until it just stops working due to weather or other factors. In other words, the way it is specified tells you exactly what you will get.