1: Noise is a product of bandwidth.
Imho the noise we talk about on this forum is just the sum of electromagnetic waves produced by all kinds of sources, of a nature we have no interest in apart from avoiding it.
In your analogy of the windows and curtains, the amount of absolute 'noise' differs only because you open or close the shutters. The total amount of signal ('light') scales linearly with it, so the relative comparison between the amount of noise ('UV') and signal ('light') stays the same. This S/N ratio is expressed in decibels (dB), and it does not change. So imho it makes no difference what bandwidth a receiving antenna is listening on: the S/N stays the same.
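The shutter argument can be checked with a quick sketch: scaling signal and noise by the same factor leaves the ratio in dB untouched (the numbers are arbitrary units, just to illustrate):

```python
import math

def snr_db(signal, noise):
    """Signal-to-noise ratio expressed in decibels."""
    return 10 * math.log10(signal / noise)

# shutters fully open: lots of light, lots of UV
print(snr_db(100.0, 1.0))   # 20.0 dB

# shutters half closed: both scale by the same factor, ratio unchanged
print(snr_db(50.0, 0.5))    # still 20.0 dB
```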
The radio wave, on the other hand, transmitted by the radio with the same available energy but into a smaller bandwidth, has a higher energy density, and therefore more field energy is available at the receiver. Since the background noise in absolute terms stays the same, the S/N ratio increases simply because the signal strength increases. A minimum S/N level is needed for stable communication, so in marginal conditions an increase in signal due to the use of a smaller bandwidth can indeed improve the link quality.
Not because your curtains are more open, but because your sun puts more energy into the visible spectrum.
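A rough sketch of that energy-density point, assuming the same transmit power is squeezed into ever narrower channels (the power figure is made up; only the ratios matter). Each halving of the channel width doubles the power spectral density, which is a 3 dB gain:

```python
import math

tx_power_mw = 100.0  # same transmit power in every case (assumed value)

for bw_mhz in (20.0, 10.0, 5.0):
    psd = tx_power_mw / bw_mhz                 # power spectral density, mW per MHz
    gain_db = 10 * math.log10(20.0 / bw_mhz)   # density gain relative to the 20 MHz case
    print(f"{bw_mhz:4.0f} MHz: {psd:5.1f} mW/MHz, +{gain_db:.1f} dB")
```

Halving gives +3 dB and quartering +6 dB in theory, in the same ballpark as the 2 to 2.5 dB gain reported further down.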
2: The SNR forms part of the radio card's data-rate control loop.
True, and we just saw that the signal, and thus the SNR (S/N), increased.
Higher connection rates can be obtained or sustained as long as they fall within the 'sensitivity' boundary of the radio receiver. (Most cards have less output power at the higher rates, and the sensitivity also decreases a bit; in general a 6 Mbps connection rate works with less signal than 54 Mbps does.)
But since the bandwidth is halved at 10 MHz, or even quartered at 5 MHz, the available data throughput also gets halved or quartered.
In marginal situations, using a narrower radio channel can improve the overall link quality to such an extent that the final gain in throughput is bigger than the loss of theoretical throughput due to the halving or quartering of the channel.
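That threshold effect can be sketched with a hypothetical rate/sensitivity table (the SNR thresholds below are illustrative, not from any real card's datasheet): if the SNR sits below the lowest rate's threshold the link is dead, while the extra few dB from a narrower channel can lift it back above that threshold, even at half the clock rate.

```python
# hypothetical 802.11a-style rate table: (Mbps at 20 MHz, minimum SNR in dB)
RATE_TABLE = [(54, 25), (36, 18), (24, 12), (12, 7), (6, 4)]

def best_rate(snr_db, bw_mhz):
    """Highest sustainable rate; narrower channels clock proportionally slower."""
    scale = bw_mhz / 20.0
    for rate_mbps, min_snr_db in RATE_TABLE:
        if snr_db >= min_snr_db:
            return rate_mbps * scale
    return 0.0  # below sensitivity for every rate: the link drops

print(best_rate(2.0, 20))        # 0.0 -> no usable rate at 20 MHz
print(best_rate(2.0 + 3.0, 10))  # 3.0 -> 6 Mbps base rate at half the clock
```

So a nominal 3 Mbps beats a nominal 0: exactly the "lousy wide link vs. stable narrow link" tradeoff described above.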
[I had two clients on a 20 MHz channel in the 5 GHz band, both getting -78 to -83 signal; because of this only the 6 and 12 Mbps connection rates gave some connectivity, and the link still dropped at times.
I changed the link to 10 MHz bandwidth and gained some 2 to 2.5 dB of signal at the clients. The links stopped dropping and the CCQ went up from a lousy 30-40% to 80-90%.
The clients were allowed 3 Mbps of download and had complained about poor internet and the impossibility of using Skype most of the day. After the change they had good internet speeds, and Skype only had occasional problems.]
What I am trying to say is that in less than optimal conditions (i.e. with noise), throughput can go up with a narrower bandwidth compared to a wide bandwidth with lots of noise!
I can only second this.
All this still doesn't answer what effect a narrower channel width has on the Fresnel zone of the radio link.
The Fresnel zone depends on the frequency used and the link distance, but I am not sure how it relates to the channel width. I have been trying to find information on the internet, but so far I have not found anything on this question.
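For reference, the standard formula for the n-th Fresnel zone radius contains only the carrier frequency and the distances to the two link ends; the channel width does not appear in it anywhere. A small sketch (the 5.8 GHz / 5 km link geometry is made up for illustration):

```python
import math

C = 299_792_458.0  # speed of light, m/s

def fresnel_radius_m(freq_hz, d1_m, d2_m, n=1):
    """Radius of the n-th Fresnel zone at a point d1 from one end, d2 from the other."""
    lam = C / freq_hz
    return math.sqrt(n * lam * d1_m * d2_m / (d1_m + d2_m))

# 5 km link at 5.8 GHz, evaluated at the midpoint
print(round(fresnel_radius_m(5.8e9, 2500.0, 2500.0), 2))  # ~8.04 m
```

Note that narrowing a channel from 20 MHz to 10 MHz only moves the band edges by a few MHz around a carrier of several GHz, so any effect on the Fresnel zone radius would be a fraction of a percent.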
Can anybody shed some light on this?