I've got around 100 RB411AH and RB433AH boards, and I've recently converted a lot of the sites over to a battery + 24V PSU solution to implement a kind of halfway solar setup. I use Morningstar solar regulators to take the PSU feed and charge the batteries, then run the load off the regulator's load terminals for the low-voltage cut-off benefits. There have been no issues with this; it has been running fine for several weeks, and one site even took a 2-day power cut and ran off batteries fine until power was restored.
However, what I'm puzzled by is the differences in the voltages being reported. I've been using the Health monitor in RouterOS to check the voltages of the boards, and one bus bar at one site might have 3 boards on it that all display different voltages, up to about ±3V off the actual voltage. I have measured (with a voltmeter) the voltage across the battery terminals, the bus bar strips the power is fed from, and the DC plugs that supply the injectors, and these all read a constant 27.2V, which is the correct float voltage for the batteries.
I know loss in cables etc. can occur, but most of these sites only have a 4-5m Cat5 cable to the radio box, fed with a MikroTik passive PoE adapter, and cable loss can't explain the site that has about a 30m run and is showing a much *higher* reading.
So is there a reason the voltages seem so different on every board? In one case an RB411AH is currently reporting 29V, and the actual voltage there is not one volt over 27.2.
I’ve always wondered how accurate that reading was.
I'm seeing the opposite: I have a 12V supply and my RB450G is reading 10.8V.
I’m also curious how they measure this. Is it a simple A/D in the micro?
What is the reference voltage and how accurate is the reference?
This could be a simple hardware design error, or just as easily a simple software error.
The classic error is rounding the per-step increment and then multiplying it by the A/D value to get the result.
The higher the A/D value, the higher the net error.
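For illustration, here's a minimal C sketch of that failure mode (all numbers are made up; I'm assuming a 10-bit ADC scaled so that full scale corresponds to 30V, which may not match the real hardware):

```c
#include <stdio.h>

/* Illustration only -- not MikroTik firmware. Assume a 10-bit ADC (0..1023)
 * scaled so that full scale corresponds to 30.000 V at the input.
 * The true step size is 30000 mV / 1023 = 29.33 mV per count.          */

int main(void)
{
    const int adc = 900;  /* hypothetical raw reading (~26.4 V actual)   */

    /* The classic bug: pre-round the per-step increment to a whole
     * number of mV, then multiply. The error grows with the count.      */
    int mv_bad = adc * 29;

    /* Better: multiply first, divide last, so rounding happens once.    */
    int mv_good = (adc * 30000 + 511) / 1023;

    printf("rounded-step: %d mV\n", mv_bad);   /* 26100                  */
    printf("scaled:       %d mV\n", mv_good);  /* 26393 -> ~0.3 V gap    */
    return 0;
}
```

The multiply-first version rounds only once at the end, so its error stays under half a count no matter how high the reading is.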
I’m curious to see how well the ROS value tracks across the range of acceptable input voltage, 6-28VDC.
Is it more accurate on the low end or not?
If I have nothing to do sometime soon, I’ll break out my bench supply and make some readings.
How accurate is the temp sensor and where is it on each board that supports it?
Edit: My Fluke measures 12.14V at the DC barrel connector inside the RB450G.
Winbox tells me 10.8V. That’s an 11% error.
We have around 20x RB/333 and RB/433AH out there and we see much the same. It appears to be very hit and miss.
We have measured the input voltage on the underside of the DC connector and we read near enough to the battery voltage, but the RB health monitor can report as much as 1-2V out. This is at 12VDC; on one site we have 3x 433AH and they all read different voltages, and all are incorrect.
Our RB/800s in our NOC are powered at 12VDC and they report 18V! (ROS 4.11)
We've found that the Morningstar TS-MPPT-60 works really well as a charge controller & battery monitor on solar sites
I vote to remove the voltage monitor and invest in decent electrolytic caps!
Besides, I’m not sure what I’d do with the information.
I mean if my supply is 12V and it suddenly drops to 6V, chances are the RB will not be up to report the problem.
“Power’s gone ---- OH, wait, I’m dead and can’t tell you that.”
Deciding what to do when a 12V supply reports 11V is a little more difficult.
Should I truck roll? Probably not.
I vote to correct both. It could be the same challenge. If the voltage monitor uses a resistor divider to drop the input voltage into the range of the A/D converter, the resistors should be 1% parts, not the 10% (or 20%, if cheap) tolerance type. And it only needs to be the two resistors in the divider; the rest can usually be 10% without problems. As the sketch below shows, two equal-value 10% resistors at worst-case opposite extremes skew the reading by about 10%, right in line with the 11% error reported above.
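A quick back-of-the-envelope check (assuming a hypothetical equal-value 10k/10k divider; the actual RouterBOARD divider values aren't public as far as I know):

```c
#include <stdio.h>

/* Worst-case error of a divider feeding the ADC, for 10% vs 1% parts.
 * Equal 10k/10k values are a made-up example; the real RouterBOARD
 * divider ratio is not known to me.                                    */

static double ratio(double r_top, double r_bot)
{
    return r_bot / (r_top + r_bot);
}

int main(void)
{
    const double nom = ratio(10e3, 10e3);   /* 0.5 nominal              */
    double tols[] = { 0.10, 0.01 };

    for (int i = 0; i < 2; i++) {
        double t = tols[i];
        /* top resistor high, bottom low: the reading comes out low     */
        double worst = ratio(10e3 * (1 + t), 10e3 * (1 - t));
        printf("%2.0f%% parts: up to %+.1f%% reading error\n",
               t * 100, 100.0 * (worst - nom) / nom);
    }
    /* 10% parts: -10.0% (a 27.2 V bank could read ~24.5 V)
     *  1% parts:  -1.0%                                                */
    return 0;
}
```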
If it is the same challenge, then it is a challenge of design skill. It would be quite concerning if 10% resistors were used unless there were a calibration routine. Assuming stability over temperature, resistor tolerance errors can be easily cal’d out.
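And a sketch of how such a cal routine might work (assuming a simple linear gain/offset model; none of this is from actual MikroTik firmware, and all counts are hypothetical):

```c
#include <stdio.h>

/* Sketch of the two-point calibration idea: apply two known voltages at
 * the factory, record the raw counts, and the derived gain/offset then
 * cancel the divider's tolerance error. All numbers are hypothetical.  */

typedef struct {
    double gain;    /* mV per ADC count */
    double offset;  /* mV at count zero */
} cal_t;

static cal_t calibrate(double v1_mv, int raw1, double v2_mv, int raw2)
{
    cal_t c;
    c.gain   = (v2_mv - v1_mv) / (double)(raw2 - raw1);
    c.offset = v1_mv - c.gain * raw1;
    return c;
}

static double adc_to_mv(cal_t c, int raw)
{
    return c.gain * raw + c.offset;
}

int main(void)
{
    /* Factory step: 12.000 V read count 409, 24.000 V read count 821
     * (made-up counts for a divider a couple of percent off nominal).  */
    cal_t c = calibrate(12000.0, 409, 24000.0, 821);

    /* Field reading: a hypothetical raw count of 928                   */
    printf("reported: %.0f mV\n", adc_to_mv(c, 928));  /* ~27.1 V       */
    return 0;
}
```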
More concerning might be the A/D reference. Is it internal or external? How is it generated?
If the reference voltage is not set by a zener diode or other stable voltage controller, either internal or external, shame on them. A 10% resistor supplying current to a regulator like that should not be a problem as a reference.
The voltage monitor on our RouterBOARD devices gives you only approximate values. If you need accurate measurements for important purposes, use a dedicated tester device instead.
Or you could build better RBs? Take any decent motherboard and a multimeter and compare the BIOS rail voltages to the meter readings; they're within ±0.05 volts almost always.
We have 2 RB433AHs powered off the same battery showing a 0.8V difference between the two boards, and even at best that's 1V off from the actual reading.
If you can't include accurate reporting, why bother including it at all? It's a real PITA to have to up-spec a solar site to handle separate monitoring gear when the RBs could do it quite simply.
We're back to this? Junipers give you 2 decimal places on their voltages, showing the 1.5V, 3.3V, 5V and 12V rails they use.
I'm sure Ciscos are the same, but I haven't personally seen it. A good network operator will be monitoring these and have pager triggers on them. A better operator will have a separate device upstream from the RB and use it to confirm readings or pinpoint faults.
What's the point of including voltage monitoring in an RB if its reports are way off and vary from board to board? We already know the board has power, which is about all a false voltage reading tells us.
edit:// You don't have voltage monitoring on all RBs, so why not spend the extra cash on the ones that do and put in a better monitoring circuit?
False voltage readings serve no purpose I can think of at all.
You can easily tell apart 12, 18, 24 and 48V; that's the basic functionality needed for a RouterBOARD. Costs would increase if we made fancier stuff. I guess you don't want the other Juniper feature (the price).
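Taking that claim at face value: even a reading that's a few volts off still lands in the right class, since the nominal supplies are far apart. A sketch of that idea (my own illustration, not MikroTik code):

```c
#include <stdio.h>

/* Nearest-nominal classification: with classes of 12/18/24/48 V, even
 * an error of 2-3 V doesn't change the answer. Illustration only.     */

static int nearest_nominal(double volts)
{
    static const int nominal[] = { 12, 18, 24, 48 };
    int best = nominal[0];
    double best_err = 1e9;

    for (int i = 0; i < 4; i++) {
        double err = volts > nominal[i] ? volts - nominal[i]
                                        : nominal[i] - volts;
        if (err < best_err) { best_err = err; best = nominal[i]; }
    }
    return best;
}

int main(void)
{
    /* Readings reported earlier in the thread still classify fine.    */
    printf("29.0 V -> %dV class\n", nearest_nominal(29.0));  /* 24     */
    printf("10.8 V -> %dV class\n", nearest_nominal(10.8));  /* 12     */
    return 0;
}
```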
Then why does the voltage display in tenths of a volt? That was a waste of time and programming. I need the actual voltage on the input. So you feel 9 volts is OK as an input, as long as the voltage reading shows 10.5 volts?
Seriously? The voltage "area" is your answer? If you have to log into a router to find out what voltage the PSU is putting out, you need to step away from the router and send it back.
And if price is the "issue" then slap another letter on the end of the RB names and charge $10 more for it. It's not exactly rocket science.
I'd pay Juniper prices to MT if MT came out with Juniper-grade hardware and software. Just like I'd shift to Juniper if they came out with a wireless product or series that competes with MT's stuff.
Precision is not a word I would use in this thread. I think it's pretty clear from the posts here that RBs often don't report anywhere near the correct voltage. (No, 1-2V out is not acceptable.)
Here's an easy way out for MT: in 4.12 and 5.0rc2, change the label next to the voltage to say "Approx. Voltage" and only show 12V rather than 12.2V.