Max size of variables still at 4096!? Answer is NO

I was browsing through help.mikrotik.com and noticed that the note about the variable size limit has been removed from the scripting documentation.

https://help.mikrotik.com/docs/pages/viewpreviousversions.action?pageId=47579229

v. 29 · Apr 04, 2023 16:13 · Testing Department · "Remove deprecated note on variable size limit"

Note: Variable value size is limited to 4096 bytes

I tested it, and it seems this has indeed become reality. What the new limit is, I could not find.

My test script is:

:local a "4096 bytes long text"; :for i from=1 to=512 do={:set $b ($b.$a); :put $i}; :put [:len $b]
...
510
511
512
2110464
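As a side note, the output is only consistent if the test string was actually 4,122 bytes long (2,110,464 / 512 = 4,122), so the literal shown above stands in for a string slightly over 4096 bytes. A quick check of the arithmetic (Python, just to verify the numbers):

```python
# Verify the concatenation arithmetic from the test output above:
# 512 iterations, each appending one copy of the test string.
iterations = 512
final_len = 2110464            # the :put [:len $b] result

chunk_len = final_len // iterations
print(chunk_len)               # 4122 -> the string was a bit longer than 4096 bytes
print(chunk_len * iterations)  # 2110464, matching the output
```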

There is still another spot where the 4096 limit is mentioned:

https://help.mikrotik.com/docs/display/ROS/Fetch#Fetch-Sendinginformationtoaremotehost

In this example, the data is uploaded as a file. Important note, since the file comes from a variable, it can only be in size up to 4KB. This is a limitation of RouterOS variables.

Example script that reads a file from a webserver directly into a variable and generates an address-list from it.

# Turris Import by Blacklister

# 20210823 new version that directly download from a http(s) server
# 20230712 new variable length allows reading big files in one go

{
/ip firewall address-list
:local update do={
 :put "Starting import of address-list: $listname"
 :put "Deleting all Dynamic entries in address-list: $listname"
 :if ($heirule != null) do={:put "Using as extra filtering: $heirule"}
 :if ($heirule = null) do={:set $heirule "."}
 :local n 0; # counter

 # remove the current list completely
 :do { /ip firewall address-list remove [find where list=$listname dynamic]} on-error={};
   :local data ([/tool fetch url=$url output=user as-value]->"data")
   :put "Imported file length $[:len $data] bytes"
     :while ([:len $data]!=0) do={ 
       :local line [:pick $data 0 [:find $data "\n"]]; # create only once and checked twice as local variable
       :if ($line~"^[0-9]{1,3}\\.[0-9]{1,3}\\.[0-9]{1,3}\\.[0-9]{1,3}" && $line~$heirule) do={
        :set $n ($n+1) 
        :do {add list=$listname address=[:pick $data 0 [:find $data $delimiter]] comment=$description timeout=$timeout} on-error={};
       }; # if IP address && extra filter if present
      :set data [:pick $data ([:find $data "\n"]+1) [:len $data]]; # removes the just added IP from the data array
     }; # while
 :put "Completed importing $listname added/replacing $n lines."
}; # do

$update url=https://view.sentinel.turris.cz/greylist-data/greylist-latest.csv delimiter=, listname=turris timeout=8d heirule=http|smtp
}
Starting import of address-list: testturris
Deleting all Dynamic entries in address-list: testturris
Using as extra filtering: http|smtp
Imported file length 64512 bytes
Completed importing testturris added 993 lines.
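The parsing loop in the script (pick everything up to the next "\n", test the line against an IPv4 regex plus the optional extra filter, then cut the address off at the delimiter) can also be sketched outside RouterOS. A rough Python equivalent, assuming the same Turris greylist layout (address first, comma-separated fields); the sample data here is made up:

```python
import re

# Sketch of the address-list import loop, assuming the Turris greylist
# layout: one record per line, IPv4 address in the first field.
IPV4_RE = re.compile(r"^[0-9]{1,3}\.[0-9]{1,3}\.[0-9]{1,3}\.[0-9]{1,3}")

def parse_greylist(data, delimiter=",", heirule="."):
    addresses = []
    rule = re.compile(heirule)           # extra filter, "." matches anything
    while data:
        # :pick $data 0 [:find $data "\n"]  -- take one line at a time
        nl = data.find("\n")
        line = data if nl < 0 else data[:nl]
        if IPV4_RE.match(line) and rule.search(line):
            # :pick ... [:find $data $delimiter] -- address is the first field
            addresses.append(line.split(delimiter, 1)[0])
        # drop the processed line, like :set data [:pick $data (nl+1) ...]
        data = "" if nl < 0 else data[nl + 1:]
    return addresses

# Made-up sample, not real greylist content:
sample = "Address,Tags\n1.2.3.4,http\n5.6.7.8,smtp\nnot-an-ip,x\n9.9.9.9,ftp\n"
print(parse_greylist(sample, heirule="http|smtp"))  # ['1.2.3.4', '5.6.7.8']
```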

Sorry, but the limit when reading from a file is still the same: 64512 bytes. What has changed here?
(ignoring the WRITING of a file)
http://forum.mikrotik.com/t/the-maximum-size-of-a-read-written-file/167507/1
I did not understand (seriously, no joke)…


This creates, on 6.x and on 7.x, one variable of 41.943.040 B (~40MB):

:global test "0123456789"
:for i from=1 to=22 do={:set test ($test . $test);:put [:len $test]}
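The loop doubles the string on every pass, so 22 doublings of 10 bytes give 10 · 2²² = 41,943,040 bytes; a trivial check of that figure:

```python
# 22 doublings of a 10-byte string: the length grows as 10 * 2**n.
size = 10
for _ in range(22):
    size *= 2        # mirrors :set test ($test . $test)

print(size)          # 41943040, i.e. ~40 MB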

The limit is the memory on the board,
but if the variable is more than ~64KB, System/Scripts/Environment becomes unusable and WinBox on that session crashes or says "ERROR: action failed".
On a router with less memory, at 40MB the console crashed: "Console has crashed; please log in again."

I have not finished the blacklist import script yet,
but with my method it is possible to read pieces of ~64KB,
combine them all into a single variable without worrying about records being cut between one read and the next,
and then finally process the list.



Based on my idea of two years ago:
http://forum.mikrotik.com/t/how-to-download-only-one-piece-of-file-at-a-time-with-tool-fetch-and-put-it-inside-a-variable/151020/1
Simply do this:
:global thefile ""
{
:local url "https://www.iwik.org/ipcountry/US.cidr"
:local filesize ([/tool fetch url=$url as-value output=none]->"downloaded")
:local maxsize 64512 ; # the maximum supported readable size of a block from a file
:local start 0
:local end ($maxsize - 1)
:local partnumber ($filesize / ($maxsize / 1024))
:local reminder ($filesize % ($maxsize / 1024))
:if ($reminder > 0) do={ :set partnumber ($partnumber + 1) }
:for x from=1 to=$partnumber step=1 do={
:set thefile ($thefile . ([/tool fetch url=$url http-header-field="Range: bytes=$start-$end" as-value output=user]->"data"))
:set start ($start + $maxsize)
:set end ($end + $maxsize)
}
}
:put [:len $thefile]
…to have one single file in memory of (at the moment I write) 1.031.093 B (~1MB).
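The chunk arithmetic deserves a note: as far as I can tell, the `downloaded` value returned by fetch is expressed in KiB, which is why the script divides `$maxsize` (bytes) by 1024 before computing the number of parts. The same planning logic in Python (names are mine, not from the script):

```python
# Sketch of the chunked-download arithmetic used above.
# Assumption: fetch's "downloaded" value is reported in KiB, hence the
# division of maxsize (in bytes) by 1024.
def plan_ranges(downloaded_kib, maxsize=64512):
    parts = downloaded_kib // (maxsize // 1024)   # whole 63 KiB blocks
    if downloaded_kib % (maxsize // 1024) > 0:    # remainder -> one extra part
        parts += 1
    ranges = []
    start, end = 0, maxsize - 1
    for _ in range(parts):
        ranges.append((start, end))               # "Range: bytes=start-end"
        start += maxsize
        end += maxsize
    return ranges

# A ~1 MB file (1,031,093 B ~= 1007 KiB) needs 16 chunks of up to 63 KiB:
print(len(plan_ranges(1007)))   # 16
print(plan_ranges(1007)[0])     # (0, 64511)
```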


And how to save the file back after the variable (the file in memory) is modified?

:execute ":put \$thefile" file=newfile.txt

But strictly speaking the file is dirty, because "put" adds 2 bytes (\r\n) at the end every time.
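If those two extra bytes matter, they can be trimmed when the file is read back; a minimal sketch of the idea in Python (hypothetical helper, not a RouterOS command):

```python
# :execute ":put \$thefile" file=newfile.txt appends "\r\n" to the output;
# trimming that suffix restores the original content.
def strip_put_suffix(data):
    return data[:-2] if data.endswith(b"\r\n") else data

print(strip_put_suffix(b"line1\nline2\r\n"))  # b'line1\nline2'
```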



:global thefile ""
{
:local url ("https://" . "view.sentinel.turris.cz/greylist-data/greylist-latest.csv") ; # split just so the forum does not alter the URL
:local filesize ([/tool fetch url=$url as-value output=none]->"downloaded")
:local maxsize 64512 ; # the maximum supported readable size of a block from a file
:local start 0
:local end ($maxsize - 1)
:local partnumber ($filesize / ($maxsize / 1024))
:local reminder ($filesize % ($maxsize / 1024))
:if ($reminder > 0) do={ :set partnumber ($partnumber + 1) }
:for x from=1 to=$partnumber step=1 do={
:set thefile ($thefile . ([/tool fetch url=$url http-header-field="Range: bytes=$start-$end" as-value output=user]->"data"))
:set start ($start + $maxsize)
:set end ($end + $maxsize)
}
}
:put [:len $thefile]
302116 = ~300KB (at the moment I write and test the script)

Using my method, the full list is imported without problems.
:global readfile do={
:local url $1
:local thefile ""
:local filesize ([/tool fetch url=$url as-value output=none]->"downloaded")
:local maxsize 64512 ; # is the maximum supported readable size of a block from a file
:local start 0
:local end ($maxsize - 1)
:local partnumber ($filesize / ($maxsize / 1024))
:local reminder ($filesize % ($maxsize / 1024))
:if ($reminder > 0) do={ :set partnumber ($partnumber + 1) }
:for x from=1 to=$partnumber step=1 do={
:set thefile ($thefile . ([/tool fetch url=$url http-header-field="Range: bytes=$start-$end" as-value output=user]->"data"))
:set start ($start + $maxsize)
:set end ($end + $maxsize)
}
:return $thefile
}

{
/ip firewall address-list
:local update do={
:global readfile
:put "Starting import of address-list: $listname"
:put "Deleting all Dynamic entries in address-list: $listname"
:if ($heirule != null) do={:put "Using as extra filtering: $heirule"}
:if ($heirule = null) do={:set $heirule "."}
:local n 0; # counter

# remove the current list completely

:do { /ip firewall address-list remove [find where list=$listname dynamic]} on-error={};

# line replaced ### :local data ([/tool fetch url=$url output=user as-value]->"data")

:local data [$readfile $url]
:put "Imported file length $[:len $data] bytes"
:while ([:len $data]!=0) do={
:local line [:pick $data 0 [:find $data "\n"]]; # create only once and checked twice as local variable
:if ($line~"^[0-9]{1,3}\\.[0-9]{1,3}\\.[0-9]{1,3}\\.[0-9]{1,3}" && $line~$heirule) do={
:set $n ($n+1)
:do {add list=$listname address=[:pick $data 0 [:find $data $delimiter]] comment=$description timeout=$timeout} on-error={};
}; # if IP address && extra filter if present
:set data [:pick $data ([:find $data "\n"]+1) [:len $data]]; # removes the just added IP from the data array
}; # while
:put "Completed importing $listname added/replacing $n lines."
}; # do

$update url=("https://" . "view.sentinel.turris.cz/greylist-data/greylist-latest.csv") delimiter=, listname=turris timeout=8d heirule=http|smtp
}
(URL splitted just to not let the forum alter the url)

Console output:
Starting import of address-list: turris
Deleting all Dynamic entries in address-list: turris
Using as extra filtering: http|smtp
Imported file length 302116 bytes
Completed importing turris added/replacing 4728 lines.



Hi,
I know it's an old post, but I think both of your scripts are useful.
I download blacklist IP files from many sites (if you know some good sources, that would be great!) and many times those files are very big!
I found this post but need some help: with msatter's script I can download the indicated .csv file and it is correctly imported, but with rextended's combined code the script exits with the error "failure: Fetch failed with status 206".
During the script's execution I see the log "Download from https://view.sentinel.turris.cz/greylist-data/greylist-latest.csv to RAM FINISHED" 4 or 5 times, and the last one is "failure: Fetch failed with status 206".
How can I resolve this?
Is it also possible to download .txt files with this script? That is what really matters to me.
Thank you so much

This seems to be an error on the webserver side, regarding the request range.
I am getting 416 on a test webserver with a 137 KB file of 8770 lines.
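For what it's worth, one common cause of 416 (Range Not Satisfiable) is requesting a start offset at or beyond the real end of the file, which can happen when the size estimate used for the chunk plan is too large. Clamping each range to the actual file size avoids that; a sketch of the idea (hypothetical helper, not from the scripts above):

```python
# Hypothetical fix sketch: never let "start" reach the file size (which
# would trigger 416) and never let "end" exceed filesize - 1.
def clamped_ranges(filesize, chunk=64512):
    ranges = []
    start = 0
    while start < filesize:
        end = min(start + chunk - 1, filesize - 1)
        ranges.append((start, end))   # "Range: bytes=start-end"
        start += chunk
    return ranges

# A ~137 KB file splits into three ranges, the last one clamped:
print(clamped_ranges(137000)[-1])  # (129024, 136999)
```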