Noise floor threshold and ros5.x

It seems that a manually set noise floor threshold prevents stations from associating with the AP when the AP is running ROS v5.x.
Is this a bug, or is it a result of the embedding of the NV2 TDMA protocol in the 5.x wireless package?

Can anyone, or MT, shed some light on this?

I have been asking MT about this since they released the NV2 wireless package for 4.xx.

They keep telling me it is working as intended.

This is why I keep telling everyone with wireless issues to make sure they have the noise floor threshold turned off. It happens no matter what mode you run wireless in: NV2, Nstreme or 802.11.
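
For anyone who wants to check this on their own gear, a minimal terminal sketch (assuming the wireless interface is called wlan1, and that "turned off" means leaving the property at its automatic default):

/interface wireless print advanced
# look for noise-floor-threshold=default; if a manual value is set, revert it:
/interface wireless set wlan1 noise-floor-threshold=default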

Chadd

Did some tests with ROS 5.1 (the commands are sketched after the results below).

The CPE, an RB411, has a receive signal of -40.
The wireless link runs on NV2 with a 10 MHz channel bandwidth.

Set the noise floor threshold to -70 >>>> no difference
Set the noise floor threshold to -50 >>>> no difference
Set the AP output power so that the signal at this CPE drops to approx. -70, CPE NFT still at -50 >>> nothing happens
Set the AP back to its original power and set the CPE noise floor threshold to -30 (safe mode enabled) >>>>> the CPE drops off and only comes back after half an hour (!), with the noise floor reset back to -50.
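
For reference, the test steps above map roughly onto these terminal commands (a sketch only: wlan1 and the exact tx-power value are assumptions, and safe mode is the usual console/Winbox toggle):

# on the CPE (RB411), ROS 5.1, NV2, 10 MHz channel, receive signal about -40
/interface wireless set wlan1 noise-floor-threshold=-70
/interface wireless set wlan1 noise-floor-threshold=-50
# on the AP: lower the output power so the CPE sees roughly -70 (value is a guess)
/interface wireless set wlan1 tx-power-mode=all-rates-fixed tx-power=10
# back on the CPE, with safe mode enabled: raise the threshold above the receive signal
/interface wireless set wlan1 noise-floor-threshold=-30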

So, does this now mean it works, or not? The fact that the CPE drops off when the NFT is set above its receive signal level suggests it does. But when an NFT value is set and the AP's signal then falls below that level, the CPE just continues to run…

Let's assume it works: when the same CPE is in an environment where a scan shows several nearby frequency channels in use, with signal levels anywhere from -60 to -85, and I set the noise floor threshold of this CPE to -50, would this mean the receiver won't 'hear' those other signals any more (so the interference is gone)?

This CPE has a solid signal and CCQ but it won't move traffic; only 20-50% of what it should. Identically configured CPEs at similar range and in a similar environment (but a different location, a house some 100 metres down the road) have no issues whatsoever.

This same CPE had no issues when the AP used a different frequency. It worked fine then, but then I had other units disconnecting due to interference.

In my area basically all available 5 GHz frequencies are occupied, even when several 10 MHz channels are used, so it is very hard to find a frequency for each AP and client that doesn't cause a problem somewhere else…

Hi …

Here it seems to be OK in the 802.11 legacy modes.

Example: an R52n AP on 2.4 GHz, with the noise floor threshold set to auto (i.e. no value set on it; it is recalibrated roughly every minute if periodic calibration is enabled), gave me a value of -105 dBm.

A sample CPE (NS2 Loco) hears the AP at -62 dBm, 48 Mbps.

The AP hears this CPE at -67 dBm, 36 Mbps.

If I set this value manually to -100 dBm:

the CPE hears the AP at -62 dBm, 48 Mbps;

the AP hears the CPE at -72 dBm, 24 Mbps.

The noise floor threshold impacts the S/N, and therefore the way the card (driver) selects the rate at which the CPE sends data to the AP.
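
If that is the mechanism, the arithmetic would look roughly like this (a sketch only; the rate mapping is an illustration, not MikroTik's actual rate-selection table, and wlan1 is an assumed interface name):

# reported S/N ~ signal-strength minus noise floor (a manual threshold overrides the measured floor)
#   signal -62 dBm, floor -105 dBm (auto)   => S/N ~ 43 dB => high rate (48 Mbps observed)
#   signal -72 dBm, floor -100 dBm (manual) => S/N ~ 28 dB => lower rate (24 Mbps observed)
#   signal -50 dBm, floor  -50 dBm (manual) => S/N ~ 0 dB  => lowest rate / association trouble
/interface wireless set wlan1 noise-floor-threshold=-100
/interface wireless registration-table print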

Interesting that the NF threshold changes the measured RX level as well, not only the S/N ratio.

Over time (e.g. the last 18 months), several changes in ROS versions (and UBNT CPE firmwares) have resulted in various TX/RX signal level combinations and various TX/RX CCQs. Once, due to firmware/ROS glitches I guess, the difference between the TX & RX signal levels was 10 dB! Why, if the AP is a 20 dBm device and the CPEs are 20 dBm devices? I even thought my UBNT CPEs were tampering with the TX power (where they should have been 20 dBm they were 10 dBm, for instance).

From the interface wireless print advanced:

noise-floor-threshold=default nv2-noise-floor-offset=default

There is this NV2 offset, for now at "default". Maybe, because of the RX period (no longer 25 µs but some milliseconds), the way the noise level is measured has changed in that mode.
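
In case anyone else wants to poke at the same knob, both properties can be reverted together from the terminal (wlan1 is an assumed interface name; I don't know what values nv2-noise-floor-offset accepts besides "default"):

# put both back to their automatic behaviour after experimenting
/interface wireless set wlan1 noise-floor-threshold=default nv2-noise-floor-offset=default
/interface wireless print advanced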

Regards.

The noise floor threshold overrides the detected noise floor. If you set the noise floor to -50 and your signal is -50, the device will perceive the SNR as 1 or 0, and that will directly impact the rate-selection algorithm, so it makes perfect sense. The noise floor should really only be tampered with in very specific environments, and in my opinion it is more for correcting "bad reads" from cards.

I have a 104 Mb/s N card from one of the suppliers
and an R52HN card.

When I scan with the same grid antenna, on the R52HN I get a noise floor of -109; on the TP-Link I get a noise floor of -85. If I use the TP-Link to connect to an AP I get a 1 Mb/s / 6 Mb/s (14%/20% CCQ) connection; with the R52HN I get a 54/54 (100%/100% CCQ) link (it's a short link, about 100 m).

If I go in and set the noise floor on the TP-Link to -105, I then achieve a 54/54 (100% CCQ) link.
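
In RouterOS terms that fix would look something like this (wlan1 is an assumed interface name; -105 is the value from the post, close to what the R52HN reports by itself):

# override the -85 dBm floor the TP-Link card detects on its own
/interface wireless set wlan1 noise-floor-threshold=-105
# then check that the link negotiates 54/54 with full CCQ again
/interface wireless registration-table print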

It is on this experience that I base the comments listed above.

Aha, I always wondered what this calibration actually does! So this is (one of) the things it does? It calibrates and sets the noise floor level? OK. Learning every day! :smiley:
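
For anyone else catching up on this, these appear to be the relevant knobs (a sketch; wlan1 is assumed, and I'm assuming the calibration interval is expressed in seconds, which would match the roughly once-a-minute recalibration mentioned above):

/interface wireless set wlan1 periodic-calibration=enabled periodic-calibration-interval=60
# with noise-floor-threshold=default the calibrated floor is what the status pane shows
/interface wireless monitor wlan1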

The noise floor threshold impacts the S/N, and therefore the way the card (driver) selects the rate at which the CPE sends data to the AP.

Hmm, I thought I understood that rate selection is based more on how many ACK returns the AP receives from the station. The S/N, IMHO, only has an influence because a bad S/N means the link is bad, so many ACK request/response packets get lost and the AP then sets and tries a lower data rate…

Interesting that the NF threshold changes the measured RX level as well, not only the S/N ratio.

I don't see this happening on my board, and it also doesn't make sense to me. RX is RX. As long as the receiver can distinguish the radio signal from the noise, it measures the energy received from the transmitter for that signal, and software in ROS converts it to a level. Of course a software bug could have an influence in this case, but basically the RX signal strength should not differ as a result of surrounding 'noise'.

[One guy shouting on an empty square will be heard by a listener at a certain volume. Now, if AC/DC (the rock band) is playing on that square and the same guy shouts again at the same volume, the listener could hear him at the same level (not considering that the sound will be damped by the crowd) if he had a good enough filter to reject all the other new noise surrounding him. A listener with a good filter will notice exactly the same volume from the shouter. The physical sound waves generated by the shouter are not lost; physically they are still around. You only need to distinguish them from all the rest. Radio wave energy works exactly the same, IMHO. A radio wave does not get lost because other radio waves are around. You only need a good filter to catch it, and then the energy is still the same.]

Over time (e.g. the last 18 months), several changes in ROS versions (and UBNT CPE firmwares) have resulted in various TX/RX signal level combinations and various TX/RX CCQs. Once, due to firmware/ROS glitches I guess, the difference between the TX & RX signal levels was 10 dB! Why, if the AP is a 20 dBm device and the CPEs are 20 dBm devices? I even thought my UBNT CPEs were tampering with the TX power (where they should have been 20 dBm they were 10 dBm, for instance).

Obviously, sometimes something goes wrong in the programming code that translates the receiver levels into the data we get presented in ROS.

From the interface wireless print advanced:

noise-floor-threshold=default nv2-noise-floor-offset=default

There is this NV2 offset, for now at "default". Maybe, because of the RX period (no longer 25 µs but some milliseconds), the way the noise level is measured has changed in that mode.

You show me again that it is worth checking the terminal more often! :confused: I never saw this. What does it mean? Does NV2 have its own conversion (receiver to console), or does it really handle the NF differently? MT should give more clarity on this!

After all this, I am still not sure what a manually set NF level would actually do on a link that is strong (good signals) but is suffering from interference or multipath reception.
If a CPE receives the AP signal at -45 while a scan shows several other nearby (in frequency) channels being picked up at levels ranging from -60 to -95, and the noise level for its working frequency is -90 to -95, would setting the NF threshold to, for instance, -60 reduce the chance that the receiver picks up energy from other radios? Or are the filters set so that only signals stronger than -60 are allowed through to the receiver, while everything else is filtered out?

I am just wondering whether setting the NF threshold to a higher (less negative) level than the default (= calculated) one would sort of 'harden' radios against unwanted interference from other frequencies.
Most of my links have relatively strong signal levels, but part of my network is located in a heavily urbanised environment where more or less all available 5 GHz band frequencies have to be used because of the topology and the number of users. As a result many radios also pick up several other signals, sometimes at levels into the -70s and -60s!
So one of my main jobs has always been to fight interference, hidden nodes, multipath, etc. I went from channel separation to RTS/CTS to physical radio separation, narrowing channel widths and playing with virtually every other wireless dynamic, and improved the links a lot. But now competitors are overshooting my area with even more signals, so I need to take the next steps to keep my network stable while at the same time honouring bigger data throughputs and lower latencies. Hence I now use NV2 and am exploring the very last configs I didn't bother to look at before.
So here is the noise floor threshold setting. First I read that it doesn't work on NV2, and now I just discover (your post!) that it might have a meaning, only probably a different one than in a normal 802.11 environment…

The exploration journey in the land of Wi-Fi technology has not ended…

You know, I've noticed this phenomenon on a link I have near me… Sometimes the signal would display at around -62/-61 dB and other times I would find it near -71/-72 dB. I have periodic calibration enabled, and I was using Nstreme; now I'm using the NV2 protocol.

At first I thought it only calibrated frequency offsets. But then MT support (I do not remember who) said so in answer to an open ticket.

Look, I'm not sure about that. I said it just because when I set the noise floor manually, the CPE TX data rate changes… 54 => 48 => 36, etc.

Maybe it is a software glitch, maybe a proactive task… a kind of "rough speed selection". Something like a "falsely detected signal"… despite the artificially low S/N the signal is still demodulated correctly, even with few ACK request/response packets lost, but part of the algorithm checks the S/N and "thinks": the S/N is too low for such a result… (just wondering).

By the way, since I did this manual NF setup on a production unit (there were 32 CPEs connected at the time), I started with the value that the status pane of the Winbox interface shows (auto).

Then I changed it to "manual" and decreased it in 3 dB steps, enough to make the CPE => AP speeds change to the next, more robust modulation scheme (or FEC, whatever).

I don't know whether 10 dB jumps in the NF threshold would mask this effect. And yes, even when you tell the radio that the noise is high, the signal is still there (and so is the real noise, now tampered with).

I don't know. Everything that is not synchronized and demodulated is… noise. Even OFDM signals from overlapping channels. The way we're talking about it makes the NF threshold look like a "noise gate": open/close and that's it.

To bury weak signals in the noise it would be necessary to change the AGC knee, I guess. But doing that where there is fading etc.… reduces the available margin. Maybe there is a question here: for an OFDM demodulator, which is "better", white (thermal) noise or some modulated carriers, even 25 dB below the main signal?

And it never will :smiley:.

A system that was designed for indoor use then became outdoor and point-to-multipoint capable (so we miss the "filtering" we have on PTP full-duplex radios and the frequency bands with some government regulation… which means cost), etc.…

Narrow beams (bigger antennas) help but cost a lot, plus wind load on the towers. And in urban & dense environments… well, I'm trying to create "micro-cells", limiting not the number of CPEs on each AP but the coverage area (the max distance I use is 1.3 km).

Regards;

I fully agree with that. Years ago 2.4 and later 5 GHz were considered a sort of 'last mile' solution, a last resort to get at least something done. 3G and WiMAX were believed to be the technologies capable of everything we wish for.
Real life now shows otherwise. Cellular operators' networks almost crumble under their popularity and the explosion of traffic, while they are still not the best solution for laptops and PCs, etc.
WiMAX is expensive and licensed, and in capabilities it is being bypassed by the developments in 802.11.
Apart from that, the coverage of both still has many 'black spots' even in the most industrialised countries, so in my opinion 802.11 still stands a good chance, even in the long run, if they keep developing the protocols and capabilities :smiley:

Narrow beams (bigger antennas) help but cost a lot, plus wind load on the towers. And in urban & dense environments… well, I'm trying to create "micro-cells", limiting not the number of CPEs on each AP but the coverage area (the max distance I use is 1.3 km).

It's always a balance between cost, options and needs. In fact I work with some 'micro-cells': a dense villa urbanisation, all concrete houses and ceramic tiled roofs (lots of signal reflections), with a radius of only 1.5 km around a hill and two of its slopes. My towers were not allowed to protrude more than a few metres above the roofs, and the only location that would reach everything (the top of the hill) is prohibited and not wanted by the inhabitants… Meaning that to reach all houses I need 6 different APs with their backhauls. Hence every client will always pick up traces of signals from other APs. Some APs are only 100 metres away from each other, and the longest AP-to-client link is never more than 1000 metres. Some clients have an AP only 50 metres away around the corner, but because a house in between blocks the signal (and other houses reflect it), I have to point their CPE to another AP a couple of hundred metres away. The signals from this close-proximity AP are still around and very strong; it is only that they are usually multipath and very dynamic in strength, so the CPE can't use them because of their unstable characteristics.

Also, since I have 120 clients here, I have to work with more APs anyway. One AP would never be able to do the job… each client is entitled to 4/5 Mb download…
So yes, from the beginning I worked with directional antennas for the clients. Since I have also recently been hammered by signals from other parties, I have already changed many CPEs (mainly Nanostations; they really perform poorly in 'noisy' environments). I use 'shielded' antennas, where the routerboard and its pigtail inside the box are protected from unwanted signals by means of a Faraday cage: either by using metal-boxed antennas (the MT ones are very good in that respect), or by isolating plastic-embedded box antennas with metal spray paint and/or aluminium foil. It works very well… and it's cheap and simple.

Actually, what I see as a very interesting development in Wi-Fi is what http://www.ruckuswireless.com/ is doing. If they can make this work in an outdoor environment and with a radius of some kilometres, this could be the future solution, outperforming many other wireless technologies.
But it is still very expensive, and I am not yet convinced about the capabilities of that system in a dense spectrum and city-like environment.

Well, we're in opposite worlds :smiley: … here I have dense vegetation, so 25 dBi grid antennas at the clients & 16 dBi sectors here and there to cover 200…600 m.

And internet access is a sort of commodity. But 3G fails, ADSL fails, cable modems fail, 2 days offline for repairs, etc. The area I am covering doesn't have such services, but I cover some urban area as well. And I'm proactive, a network manager :smiley: … I normally solve customer problems within a couple of hours. Currently 80 customers => 2.5 PCs per home. I even take care of some PCs (not personally, but I keep my eyes on it), so the customer perception is of a healthy & reliable system.

Hmm, I'm still using the NS2 Loco & NS2 Loco M, now mainly because of the unit cost. The CPEs aren't mine but are bought by the customers, and the region is not rich, so this was the way to solve internet access for them. There is another neighbourhood where maybe I can use MT CPEs on 5 GHz, but that is a 2012 target. For now I need to keep installing my 3…5 new ones a week in the existing areas.

My service is low speed but guarantees 270K 24 hours a day with 1M bursts, so it's OK for light use and some video (I have several shared streams for YouTube, updates from MS & antiviruses, etc.), so the perception each customer has is equivalent to a 768…1.2M plan.

Ah, this is nice. I don't know if it is the same company I was in touch with back in 2006…2008. They (or maybe it was another company) had a flat panel with a 64 x 64 dipole matrix that beamformed the antenna pattern for each CPE on the fly. I remember that the "beam" was equivalent to a ~22 dBi grid reflector.

regards