I've been working with a device in a remote location that has an R11e-LTE6 modem installed. I am noticing some oddities in the way cells are selected and I wondered if anyone has any insight as to how the modem/firmware chooses the cell and if this can be modified in any way.
What I am seeing is that cells with good RSSI and RSRP seem to be chosen despite sub-optimal RSRQ and SINR figures. This causes distant cells with high TX power to be selected, even though they are on a very congested band and offer a lower-quality connection. For example, I'm seeing a cell with RSRQ -19.5 dB and SINR -16 dB chosen over one with RSRQ -11 dB and SINR 2 dB. I know these signals are still pretty marginal, but it's a remote location some 12 km from the nearest cell, and we're sometimes working with cells up to 25 km away. I suspect that the modem/firmware is only considering the RSSI and RSRP figures, not RSRQ and SINR. Unfortunately, the two sets of figures give opposite views of which cells are best; the RSRQ and SINR seem to be more accurate, as you would expect.
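For context on why the two sets of figures can disagree: RSRQ is defined (3GPP TS 36.214) as N × RSRP / RSSI, where N is the number of resource blocks, so a cell can look strong on RSRP while most of the RSSI is interference from a loaded band. A quick sketch of that relationship (the numbers here are illustrative, not my actual readings):

```python
import math

def rsrq_db(rsrp_dbm: float, rssi_dbm: float, n_rb: int) -> float:
    """Approximate RSRQ in dB from RSRP/RSSI in dBm.

    Per 3GPP TS 36.214, RSRQ = N * RSRP / RSSI in linear terms,
    where N is the number of resource blocks measured.
    """
    return 10 * math.log10(n_rb) + rsrp_dbm - rssi_dbm

# 20 MHz carrier -> N = 100 resource blocks.
# Same RSRP, but the busier cell's RSSI is inflated by interference:
print(rsrq_db(-100, -70, 100))  # congested cell -> -10.0 dB
print(rsrq_db(-100, -74, 100))  # quieter cell   -> -6.0 dB
```

So a ranking based purely on RSRP can't see this difference at all, which matches what I'm observing.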
I have tried using cell locking to lock the primary to the band I want (B3@20 MHz, EARFCN 1501) and to a specific cell which offers much better performance and doesn't trigger almost constant handoffs. However, if the connection is lost temporarily (for example due to rain fade), the IP link doesn't come back up even though the modem info shows it connected to the cell. I think the modem just hangs, but only when cell lock is on. If I remove the lock, the modem recovers fine every time, but the connection is unstable and underperforms because it keeps selecting a primary on B20@10 MHz on distant, congested cells with a high noise floor.
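For reference, I applied the lock with the AT*Cell command via at-chat, following MikroTik's cell-lock notes for this modem (the PCI here is a placeholder, not my actual cell):

```
# AT*Cell=<mode>,<NetworkMode>,<band>,<EARFCN>,<PCI>
# mode 2 = lock to cell, NetworkMode 3 = LTE, PCI 123 = placeholder
/interface lte at-chat lte1 input="AT*Cell=2,3,,1501,123"
```

As a crude workaround for the hang, I'm considering a scheduled watchdog that bounces the interface when pings stop getting through (sketch only; the interface name and target IP are examples):

```
# Hypothetical watchdog script, run from /system scheduler every few minutes:
# if nothing answers via lte1, disable/re-enable the interface to recover
:if ([/ping 8.8.8.8 interface=lte1 count=3] = 0) do={
    /interface lte set lte1 disabled=yes
    :delay 5s
    /interface lte set lte1 disabled=no
}
```

It's ugly, though, and I'd much rather understand why the modem hangs only when the lock is active.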
Any thoughts or experiences on this would be greatly appreciated.