Okay, here's a scenario I've been considering. Say I'm covering a hotel or condominium with a 9 dBi sector panel antenna: 120 degrees horizontal, 15 degrees vertical beamwidth. Using an SR2 or Senao card I get 26 dBm of output power, which puts me at 35 dBm EIRP. The laptops on the other side, however, average only 13 to 15 dBm EIRP, not to mention the walls in between.
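For anyone who wants to sanity-check the numbers, the EIRP arithmetic is just addition in dB. A quick Python sketch (the cable-loss parameter is my addition; I'm assuming zero feedline loss here since the post doesn't mention any):

```python
def eirp_dbm(tx_power_dbm: float, antenna_gain_dbi: float,
             cable_loss_db: float = 0.0) -> float:
    """EIRP = transmit power + antenna gain - feedline loss, all in dB."""
    return tx_power_dbm + antenna_gain_dbi - cable_loss_db

# AP side: 26 dBm card into a 9 dBi sector panel
print(eirp_dbm(26, 9))   # 35.0 dBm

# Laptop side: roughly 13-15 dBm total EIRP from the internal antenna
print(eirp_dbm(14, 0))   # 14.0 dBm
```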
So data from the laptops comes in, picks up +9 dB of receive gain from the antenna, and the card sees maybe a -81 dBm signal. That's with the 9 dBi antenna.
Now say hypothetically I don't need 15 degrees of vertical beamwidth, only about 5 degrees, and I swap in a 20 dBi 120-degree sector panel instead. Turning the card down to 16 dBm keeps the EIRP in the same range (36 dBm). But on the receive side I've picked up 11 dB: the signal that was -81 dBm should now be -70 dBm, which is a major performance difference, and the minimum usable signal at the antenna improves by the same margin (-118 vs. -107 dBm). The laptop itself sees roughly the same signal level, which was fine before, but the difference in download speed should be astounding going from -81 to -70 dBm.
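The receive-side math above can be sketched the same way. Note the -98 dBm card sensitivity below is my assumption, chosen so the -107/-118 figures line up; plug in your own card's spec:

```python
OLD_GAIN_DBI = 9       # original sector panel
NEW_GAIN_DBI = 20      # higher-gain, narrower-beam panel
OLD_RX_DBM = -81       # signal at the card with the 9 dBi panel

# Path loss is unchanged; only the antenna gain differs.
delta_db = NEW_GAIN_DBI - OLD_GAIN_DBI      # +11 dB
new_rx_dbm = OLD_RX_DBM + delta_db
print(new_rx_dbm)                           # -70

# The same 11 dB shifts the weakest over-the-air signal the system can hear:
CARD_SENSITIVITY_DBM = -98                  # hypothetical card figure
print(CARD_SENSITIVITY_DBM - OLD_GAIN_DBI)  # -107
print(CARD_SENSITIVITY_DBM - NEW_GAIN_DBI)  # -118
```

The key point the sketch makes explicit: transmit power has to come down to hold EIRP constant, but the receive gain is free, so the uplink from those weak laptop radios is where the 11 dB actually pays off.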
Has anyone else experienced a change like this, or is this all hypothetical?