AP has 25+ clients, all at relatively short range (<1km), in an area with many other APs. (5 APs, 8 backhaul units and 130 clients within a 2km radius! All overshot by 6 foreign APs and 4 distant ones that also belong to me.)
The reader can imagine how heavily the 5GHz band is utilised…
This AP is transmitting at reduced power, 16dBm. This gives the worst client a received level of -76dBm and the best client -38dBm.
Lowering the AP output further (the lower the better) is not an option, since the weakest client would then drop off or get interference problems from other APs with stronger signals on adjacent channels. (All 5GHz, 10MHz channel width.)
Most clients themselves are at default power (18dBm for LP cards, 24dBm for HP cards).
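(For reference: assuming roughly reciprocal links, the per-client path loss falls straight out of these numbers as path loss = AP TX − client RSSI, i.e. 16 − (−76) = 92dB for the worst client and 16 − (−38) = 54dB for the best.)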
I now have two options:

- Set all clients' output power so that they are all received by the AP at roughly the same strength, which is basically the level at which it receives the weakest client (-76 to -78dBm). This creates a situation on some clients where the difference between the signal they receive from the AP and the signal the AP receives from them is some 20 to 25dB! Can this be a problem? Does a radio work better or worse when its output level is roughly similar to what it receives, or is a big difference a problem?
- Set all clients' power so that their signal reaches the AP at roughly the same level as the AP's signal is received by that client. This would mean that the signals received by the AP can differ a lot (up to 25dB!) from one client to another… (see the sketch below)
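To make the arithmetic behind both options concrete, here is a minimal sketch, assuming roughly reciprocal path loss and equal antenna gains on both ends; the client names and RSSI values are invented for illustration:

```python
# Sketch of the two options, assuming reciprocal path loss (same loss in
# both directions) and ignoring antenna-gain differences. Client names and
# RSSI values are made-up examples.

AP_TX_DBM = 16  # the AP's output power (dBm)

# RSSI each client measures from the AP (dBm)
clients = {"best": -38, "mid": -60, "worst": -76}

def path_loss(rssi_from_ap):
    """Per-client path loss in dB, derived from AP TX power and RSSI."""
    return AP_TX_DBM - rssi_from_ap

# Option 1: equalise what the AP hears, targeting the weakest client's level.
target_at_ap = -76
for name, rssi in clients.items():
    tx = target_at_ap + path_loss(rssi)
    print(f"option 1: {name:5s} TX = {tx:4d} dBm -> AP hears {target_at_ap} dBm")

# Option 2: make each link symmetric: the AP hears the client at the same
# level the client hears the AP. With reciprocal path loss this reduces to
# client TX = AP TX for every client, and the AP-side levels spread widely.
for name, rssi in clients.items():
    tx = rssi + path_loss(rssi)  # algebraically equals AP_TX_DBM
    print(f"option 2: {name:5s} TX = {tx:4d} dBm -> AP hears {rssi} dBm")
```

Note the side effects: under option 1 the nearest client would need a TX setting around -22dBm, below the minimum most cards allow, so the levels at the AP will still spread somewhat in practice; and under option 2, if the links really are reciprocal, the rule collapses to simply setting every client to the AP's own 16dBm.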
Can anybody give me well-argued advice on this?
The general idea is to lower radio transmit power as much as possible without making transmissions suffer from foreign (competitors') signals that hit the stations at -75 to -95dBm, sometimes on very adjacent frequencies.
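And a quick sanity check on that goal, again with assumed numbers: if clients are equalised at the AP around -76dBm while the strongest foreign signal arrives at -75dBm, the wanted signal ends up below the interferer:

```python
# Rough signal-to-interference check at the AP, with assumed numbers.
wanted_at_ap = -76        # equalised client level from option 1 (dBm)
strongest_foreign = -75   # strongest foreign signal in/near the channel (dBm)
margin_db = wanted_at_ap - strongest_foreign
print(f"SIR margin: {margin_db} dB")  # -1 dB: the interferer is louder
```

The margin a link needs depends on the modulation rate, but equalising all the way down to the weakest client leaves almost nothing against a -75dBm interferer.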