Using amps, I have seen small improvements in keeping customers with poor signal connected, at the cost of a slight loss of throughput.
That loss, like you're seeing, is due to signal saturation of the g/a modulation and excessive RX gain from over-pre-amplification.
In order to use a bi-directional amplifier (BDA) with -no- degrading effects, several things have to be understood.....
If the amp does not have some automatic power control that interactively looks at the linearity of the signal, you'll need to adjust your radio's output accordingly. Any compression of the signal will result in the QAM-based modulations simply not working correctly. You'll fall back to QPSK most of the time, which is more forgiving of non-linearity, but that drops you to 18 Mb/sec. You have to keep the signal 6 dB below the amp's compression point to avoid throughput issues. This generally means a 1 watt amp is only good for 250 mW.
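That 6 dB backoff figure is just dB arithmetic: 6 dB is roughly a 4x reduction in power. A minimal sketch (standard dB-to-linear conversion; the 1 watt rating is the example from the text):

```python
# dB backoff arithmetic: backing a 1 W amp off 6 dB from its
# compression point leaves 1 W / 10^(6/10) ~= 0.25 W of usable power.

def db_to_linear(db):
    """Convert a dB value to a linear power ratio."""
    return 10 ** (db / 10)

def usable_power(p1db_watts, backoff_db=6.0):
    """Power left after backing off from the compression point."""
    return p1db_watts / db_to_linear(backoff_db)

print(usable_power(1.0))  # roughly 0.25 W out of a 1 W amp
```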
Some auto power control amps also use a quadratic (RMS) averaging scheme to look at the signal for max TX power, and this algorithm, which I've shown clearly on my Anritsu analyzer, is way too slow to react to changes and will cause problems with the signal as well.
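To illustrate why a long quadratic (RMS) average reacts slowly, here's a toy sketch; the window length and sample values are my own illustrative assumptions, not taken from any particular amp's control loop:

```python
# Toy RMS (quadratic average) power detector: when a burst arrives,
# a long averaging window barely moves, so a control loop keyed off
# it lags the actual peak. Values below are purely illustrative.

def rms(samples):
    """Quadratic average (root mean square) of a sample window."""
    return (sum(s * s for s in samples) / len(samples)) ** 0.5

window = [1.0] * 100          # 100 samples of a steady low-level signal
window = window[1:] + [10.0]  # a sudden 10x burst enters the window

print(rms(window))            # ~1.41: far below the 10.0 peak the amp must handle
```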
The second issue is the type of pre-amp used in the BDA. Some use really cheap MMIC components with 3 dB and higher noise figures, and some don't have any front-end filtering to keep out-of-band receive gain from occurring. What's worse is an amp that has, say, 17 dB of gain on the receive side; that's way too hot. A well-built BDA will use pHEMT-based pre-amplification, with at least a 3-pole ceramic filter on the receive path (6 poles preferably). The two combined will only end up giving you a real-world 10-12 dB of receive gain, but at that point it's not running the radio's receiver over with too much signal or noise, and it's only amplifying what you want it to.
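Friis' cascade formula shows why a filter ahead of a quiet pHEMT stage can still give a low overall noise figure. The component numbers below (1 dB filter loss, 0.6 dB LNA noise figure, 13 dB gain, 3 dB MMIC noise figure) are illustrative assumptions on my part, not measurements from any specific BDA:

```python
# Friis cascade noise figure: F_total = F1 + (F2 - 1)/G1 + ...
# (all F and G as linear ratios; report the result back in dB).
import math

def db_to_lin(db):
    return 10 ** (db / 10)

def lin_to_db(x):
    return 10 * math.log10(x)

def cascade_nf(stages):
    """stages: list of (noise_figure_dB, gain_dB); returns total NF in dB."""
    f_total = 0.0
    gain_product = 1.0
    for i, (nf_db, gain_db) in enumerate(stages):
        f = db_to_lin(nf_db)
        if i == 0:
            f_total = f
        else:
            f_total += (f - 1) / gain_product
        gain_product *= db_to_lin(gain_db)
    return lin_to_db(f_total)

# 3-pole filter (1 dB loss => NF 1 dB, gain -1 dB) then pHEMT LNA (0.6 dB NF, 13 dB gain)
print(round(cascade_nf([(1.0, -1.0), (0.6, 13.0)]), 2))  # ~1.6 dB overall
print(round(cascade_nf([(3.0, 17.0)]), 2))               # cheap unfiltered MMIC: 3.0 dB
```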
That being said, I fully agree that going with a higher power radio and a higher gain, more directional antenna is going to be your best bet for the lion's share of applications.
I don't agree with a blanket statement that amps don't work at all. Having designed several models of BDAs, I can say with 100% certainty that the points made here cover the bulk of the issues. The big problem is that a lot of amps truly don't work well or have horrible components in them. Finally, most of the installs I've seen and consulted on with amps are being driven into compression, which truly will make things worse. A simple look with an analyzer shows this, and you can drop the TX power of the radio until it clears, and voila, things work better. Problem is, most installers don't carry a $20,000 analyzer with them to look at this.
On to the question that started this thread....
Being a Radio Amateur operator here in the US, I run a network of data nodes that link our voice repeaters together over some very long distances. Under Part 97 FCC rules we're not limited to the same power limitations as normal users, so my nodes are running 5 watt BDAs on 802.11a links. But again, to keep compression from occurring, I'm only driving the amps with enough power to just stay under the compression point, which ends up derating the amplifier to only about 1.5 watts. Still, 1.5 watts into a 32 dB antenna is -ALOT- of power. I am using the R52 cards with amps and have spectrum analyzer plots, if you would like to see them, showing the linearity of the signal. The R52 is also good for amateur radio use, as you can take advantage of portions of the spectrum we're licensed to use.
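The "1.5 watts into a 32 dB antenna" claim is easy to put numbers on. Treating the 32 dB figure as antenna gain in dBi (my assumption), the effective radiated power works out to roughly 2.4 kW EIRP:

```python
# EIRP arithmetic: 1.5 W transmit power into a 32 dBi gain antenna.
import math

def watts_to_dbm(w):
    return 10 * math.log10(w * 1000)

def dbm_to_watts(dbm):
    return 10 ** (dbm / 10) / 1000

tx_dbm = watts_to_dbm(1.5)      # ~31.8 dBm out of the derated amp
eirp_dbm = tx_dbm + 32.0        # add antenna gain (assumed dBi)

print(round(eirp_dbm, 1))             # ~63.8 dBm
print(round(dbm_to_watts(eirp_dbm)))  # ~2377 W, i.e. about 2.4 kW EIRP
```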