802.11a/g works out to roughly 1.25 bits per hertz of net spectral efficiency at the highest modulation (54 Mbps). So a 20 MHz channel width, connected at 54 Mbps, will provide about 25 Mbps of net throughput.
If you use a 5 MHz channel width and are connected at 54 Mbps, your net throughput would be about 6.25 Mbps.
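The scaling above can be sketched numerically. This is a rough estimate, not a measurement: it assumes net throughput scales linearly with channel width, using the 20 MHz / 25 Mbps figure quoted above as the baseline.

```python
# Net throughput scales roughly linearly with channel width for a given
# modulation. Baseline assumption (from the text): a 20 MHz channel
# connected at 54 Mbps yields about 25 Mbps of net throughput.
BASELINE_WIDTH_MHZ = 20
BASELINE_NET_MBPS = 25.0

def net_throughput_mbps(width_mhz: float) -> float:
    """Estimate net throughput for a narrower channel, assuming linear scaling."""
    return BASELINE_NET_MBPS * (width_mhz / BASELINE_WIDTH_MHZ)

for width in (20, 10, 5):
    print(f"{width} MHz -> ~{net_throughput_mbps(width):.2f} Mbps net")
# 20 MHz -> ~25.00 Mbps net
# 10 MHz -> ~12.50 Mbps net
# 5 MHz -> ~6.25 Mbps net
```

Real-world numbers will vary with SNR, frame sizes, and retries, but the linear relationship is the key takeaway.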
You would typically run a 5 or 10 MHz channel width if you want to:
- Run more than 3 APs on a 2.4 GHz site without using overlapping frequencies. The normal 20 MHz channel width only gives you 3 non-overlapping channels (1, 6, 11). With 5 or 10 MHz channel widths, the number of non-overlapping channels is quadrupled or doubled (respectively).
- Deal with a site that has quite a bit of interference, where you can't stably use a traditional 20 MHz channel width. On the flip side, you have to use 802.11a/g only with 5/10 MHz channel widths; you cannot use 802.11b. 802.11a/g does not handle interference as well as 802.11b, and it also requires a much better SNR to achieve its complex OFDM modulation schemes.
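The channel-count math from the first point can be sketched the same way. This assumes the usual 3 non-overlapping 20 MHz channels (1, 6, 11) in 2.4 GHz as the baseline and linear scaling with channel width:

```python
# Rough count of non-overlapping 2.4 GHz channels at narrower widths,
# assuming 3 non-overlapping channels (1, 6, 11) at the standard 20 MHz
# width and linear scaling as the channel narrows.
BASELINE_CHANNELS = 3
BASELINE_WIDTH_MHZ = 20

def non_overlapping_channels(width_mhz: int) -> int:
    return BASELINE_CHANNELS * BASELINE_WIDTH_MHZ // width_mhz

for width in (20, 10, 5):
    print(f"{width} MHz width -> {non_overlapping_channels(width)} non-overlapping channels")
# 20 MHz width -> 3 non-overlapping channels
# 10 MHz width -> 6 non-overlapping channels
# 5 MHz width -> 12 non-overlapping channels
```

That doubling (10 MHz) and quadrupling (5 MHz) of usable channels is what makes narrow widths attractive on dense multi-AP sites.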
You'll need to do some testing to find out whether the reduced bandwidth results in better performance than a traditional 802.11b/20 MHz channel width.