msatter
Forum Guru
Topic Author
Posts: 2912
Joined: Tue Feb 18, 2014 12:56 am
Location: Netherlands / Nīderlande

Max size of variables still at 4096!? Answer is NO

Wed Jul 12, 2023 10:07 pm

I was browsing through help.mikrotik.com and noticed that the note about the variable size limit has been removed from the scripting documentation for RouterOS.

https://help.mikrotik.com/docs/pages/vi ... d=47579229

v. 29 | Apr 04, 2023 16:13 | Testing Department | Remove deprecated note on variable size limit.

Note: Variable value size is limited to 4096 bytes

I tested it and it seems the limit really has been lifted. What the new limit is, I could not find.

My test script is:
# the string here is a placeholder; in the actual test $a held a chunk of roughly 4 KB
:local a "4096 bytes long text"; :local b ""; :for i from=1 to=512 do={:set b ($b.$a); :put $i}; :put [:len $b]
.
.
510
511
512
2110464
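
For reference, 2110464 / 512 = 4122, so the chunk pasted in the real test was apparently a bit over 4 KB. A self-contained variant of the same test (my own sketch; it builds the 4 KB chunk by doubling a short seed instead of pasting a long literal) gives a clean power-of-two result:

{
    # build a 4096-byte chunk by repeated doubling, then append it 512 times
    :local a "0123456789ABCDEF"
    :for i from=1 to=8 do={ :set a ($a . $a) }
    :put [:len $a]; # 16 * 2^8 = 4096
    :local b ""
    :for i from=1 to=512 do={ :set b ($b . $a) }
    :put [:len $b]; # 512 * 4096 = 2097152, far beyond the old 4096-byte limit
}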

There is still another spot where the 4096 limit is mentioned:

https://help.mikrotik.com/docs/display/ ... remotehost

In this example, the data is uploaded as a file. Important note, since the file comes from a variable, it can only be in size up to 4KB. This is a limitation of RouterOS variables.
 
msatter
Forum Guru
Topic Author
Posts: 2912
Joined: Tue Feb 18, 2014 12:56 am
Location: Netherlands / Nīderlande

Re: Max size of variables still at 4096!? Answer is NO

Thu Jul 13, 2023 12:01 am

Example script that reads a file from a web server directly into a variable and generates an address-list from it.
# Turris Import by Blacklister

# 20210823 new version that downloads directly from an http(s) server
# 20230712 new variable length limit allows reading big files in one go

{
/ip firewall address-list
:local update do={
 :put "Starting import of address-list: $listname"
  :put "Deleting all Dynamic enties in address-list: $listname"
  :if (heirule != null) do={:put "Using as extra filtering: $heirule"}
  :if ($heirule = null) do={:set $heirule "."}
  :local n 0; # counter
  
 # remove the current list completely
 :do { /ip firewall address-list remove [find where list=$listname dynamic]} on-error={};
   :local data ([/tool fetch url=$url output=user as-value]->"data")
   :put "Imported file length $[:len $data] bytes"
     :while ([:len $data]!=0) do={ 
       :local line [:pick $data 0 [:find $data "\n"]]; # extract the current line once into a local variable
       :if ($line~"^[0-9]{1,3}\\.[0-9]{1,3}\\.[0-9]{1,3}\\.[0-9]{1,3}" && $line~$heirule) do={
        :set $n ($n+1) 
        :do {add list=$listname address=[:pick $data 0 [:find $data $delimiter]] comment=$description timeout=$timeout} on-error={};
       }; # if IP address && extra filter if present
      :set data [:pick $data ([:find $data "\n"]+1) [:len $data]]; # drop the processed line from $data
     }; # while
 :put "Completed importing $listname added/replacing $n lines."
}; # do

$update url=https://view.sentinel.turris.cz/greylist-data/greylist-latest.csv delimiter=, listname=turris timeout=8d heirule=http|smtp
}
Starting import of address-list: testturris
Deleting all Dynamic entries in address-list: testturris
Using as extra filtering: http|smtp
Imported file length 64512 bytes
Completed importing testturris added 993 lines.
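
The reported 64512 bytes here is exactly the cap on the data a single fetch output=user call returns, so the download was truncated to one ~63 KiB block. A quick way to spot such truncation (my own sketch; it assumes the as-value result exposes the "downloaded" counter, in KiB, next to "data"):

{
    # compare the returned payload length with the reported download size
    :local url  "https://view.sentinel.turris.cz/greylist-data/greylist-latest.csv"
    :local res  [/tool fetch url=$url output=user as-value]
    :local data ($res->"data")
    :local dl   ($res->"downloaded")
    :put "data bytes:     $[:len $data]"
    :put "downloaded KiB: $dl"
}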
 
rextended
Forum Guru
Posts: 12014
Joined: Tue Feb 25, 2014 12:49 pm
Location: Italy

Re: Max size of variables still at 4096!? Answer is NO

Thu Jul 13, 2023 12:55 am

Imported file length 64512 bytes

Sorry, but the limit when reading from a file is still the same: 64512 bytes. What has changed about this?
(ignoring the WRITING of a file)
viewtopic.php?t=197190#p1008826
I did not understand (seriously, no joke)...


This creates, on both 6.x and 7.x, one variable of 41,943,040 B (~40 MB):
:global test "0123456789"
:for i from=1 to=22 do={:set test ($test . $test);:put [:len $test]}
The limit is the memory on the board, but if the variable is larger than ~64 KB, the System/Script/Environment menu becomes unusable and WinBox on that session crashes or reports "ERROR: action failed".
On routers with less memory, at 40 MB the console crashed: "Console has crashed; please log in again".

I never finished my blacklist import script, but with my method it is possible to read chunks of ~64 KB, combine them all into a single variable without worrying about lines being cut between one read and the next, and then finally process the list.



Based on my idea from 2 years ago:
viewtopic.php?f=9&t=177530#p872372
Simply do this:

refreshed old glory code

:global thefile ""
{
    :local url        "https://www.iwik.org/ipcountry/US.cidr"
    :local filesize   ([/tool fetch url=$url as-value output=none]->"downloaded") ; # "downloaded" is in KiB, hence the /1024 below
    :local maxsize    64512 ; # maximum readable block size from a file (the fetch data cap)
    :local start      0
    :local end        ($maxsize - 1)
    :local partnumber ($filesize / ($maxsize / 1024)) ; # number of full 63 KiB parts
    :local remainder  ($filesize % ($maxsize / 1024))
    :if ($remainder > 0) do={ :set partnumber ($partnumber + 1) }
    :for x from=1 to=$partnumber step=1 do={
         :set thefile ($thefile . ([/tool fetch url=$url http-header-field="Range: bytes=$start-$end" as-value output=user]->"data"))
         :set start   ($start + $maxsize)
         :set end     ($end   + $maxsize)
    }
}
:put [:len $thefile]
...to have one single file in memory of (at the time of writing) 1,031,093 B (~1 MB)


And to save the file back after the variable (the file in memory) has been modified?
:execute ":put \$thefile" file=newfile.txt
But the resulting file is not byte-exact, because :put adds 2 bytes (\r\n) at the end every time.
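
If a byte-exact copy matters, one option (my own sketch, not from the post; $filedata is a hypothetical variable holding the file's contents read back in) is to strip those two trailing bytes again:

# $filedata is assumed to hold the contents of newfile.txt read back into a variable;
# :put appended "\r\n" when the file was written, so drop the last 2 bytes if present
:global filedata
:if ([:len $filedata] >= 2 && [:pick $filedata ([:len $filedata] - 2) [:len $filedata]] = "\r\n") do={
    :set filedata [:pick $filedata 0 ([:len $filedata] - 2)]
}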



example with your URL

:global thefile ""
{
    :local url        ("https://" . "view.sentinel.turris.cz/greylist-data/greylist-latest.csv") ; # split in two just so the forum does not mangle the URL
    :local filesize   ([/tool fetch url=$url as-value output=none]->"downloaded") ; # "downloaded" is in KiB, hence the /1024 below
    :local maxsize    64512 ; # maximum readable block size from a file (the fetch data cap)
    :local start      0
    :local end        ($maxsize - 1)
    :local partnumber ($filesize / ($maxsize / 1024)) ; # number of full 63 KiB parts
    :local remainder  ($filesize % ($maxsize / 1024))
    :if ($remainder > 0) do={ :set partnumber ($partnumber + 1) }
    :for x from=1 to=$partnumber step=1 do={
         :set thefile ($thefile . ([/tool fetch url=$url http-header-field="Range: bytes=$start-$end" as-value output=user]->"data"))
         :set start   ($start + $maxsize)
         :set end     ($end   + $maxsize)
    }
}
:put [:len $thefile]
302116 bytes = ~300 KB (at the time I write and test this script)
 
rextended
Forum Guru
Posts: 12014
Joined: Tue Feb 25, 2014 12:49 pm
Location: Italy

Re: Max size of variables still at 4096!? Answer is NO

Thu Jul 13, 2023 2:01 am

Example script that reads a file from a web server directly into a variable and generates an address-list from it.
[…]
Starting import of address-list: testturris
Deleting all Dynamic entries in address-list: testturris
Using as extra filtering: http|smtp
Imported file length 64512 bytes
Completed importing testturris added 993 lines.

Using my method, the full list is imported without problems.

combined code

:global readfile do={
    :local url        $1
    :local thefile    ""
    :local filesize   ([/tool fetch url=$url as-value output=none]->"downloaded") ; # "downloaded" is in KiB, hence the /1024 below
    :local maxsize    64512 ; # maximum readable block size from a file (the fetch data cap)
    :local start      0
    :local end        ($maxsize - 1)
    :local partnumber ($filesize / ($maxsize / 1024)) ; # number of full 63 KiB parts
    :local remainder  ($filesize % ($maxsize / 1024))
    :if ($remainder > 0) do={ :set partnumber ($partnumber + 1) }
    :for x from=1 to=$partnumber step=1 do={
         :set thefile ($thefile . ([/tool fetch url=$url http-header-field="Range: bytes=$start-$end" as-value output=user]->"data"))
         :set start   ($start + $maxsize)
         :set end     ($end   + $maxsize)
    }
    :return $thefile
}

{
/ip firewall address-list
:local update do={
 :global readfile
 :put "Starting import of address-list: $listname"
  :put "Deleting all Dynamic enties in address-list: $listname"
  :if (heirule != null) do={:put "Using as extra filtering: $heirule"}
  :if ($heirule = null) do={:set $heirule "."}
  :local n 0; # counter
  
 # remove the current list completely
 :do { /ip firewall address-list remove [find where list=$listname dynamic]} on-error={};
### line replaced ###  :local data ([:tool fetch url=$url output=user as-value]->"data")
   :local data [$readfile $url]
   :put "Imported file length $[:len $data] bytes"
     :while ([:len $data]!=0) do={ 
       :local line [:pick $data 0 [:find $data "\n"]]; # extract the current line once into a local variable
       :if ($line~"^[0-9]{1,3}\\.[0-9]{1,3}\\.[0-9]{1,3}\\.[0-9]{1,3}" && $line~$heirule) do={
        :set $n ($n+1) 
        :do {add list=$listname address=[:pick $data 0 [:find $data $delimiter]] comment=$description timeout=$timeout} on-error={};
       }; # if IP address && extra filter if present
      :set data [:pick $data ([:find $data "\n"]+1) [:len $data]]; # drop the processed line from $data
     }; # while
 :put "Completed importing $listname added/replacing $n lines."
}; # do

$update url=("https://" . "view.sentinel.turris.cz/greylist-data/greylist-latest.csv") delimiter=, listname=turris timeout=8d heirule=http|smtp
}
(URL split in two just so the forum does not mangle it)

Console output:
Starting import of address-list: turris
Deleting all Dynamic entries in address-list: turris
Using as extra filtering: http|smtp
Imported file length 302116 bytes
Completed importing turris added/replacing 4728 lines.
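
Since $readfile is declared as a global function, it can also be reused on its own outside the import script, for example (my own snippet, using the US.cidr URL from earlier in the thread):

{
    # stand-alone use of the chunked reader ($readfile must already be defined)
    :global readfile
    :local cidrlist [$readfile "https://www.iwik.org/ipcountry/US.cidr"]
    :put [:len $cidrlist]
}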
 
tommmikro
just joined
Posts: 1
Joined: Mon Dec 18, 2023 4:31 pm

Re: Max size of variables still at 4096!? Answer is NO

Mon Dec 18, 2023 9:31 pm

Hi,
I know it's an old post, but I think both of your scripts are useful.
I regularly download blacklist IP files from many sites (if you can recommend a good source, that would be great!) and those files are often very big.
I found this post, but I need some help: with msatter's script I can download the indicated .csv file and it is imported correctly, but with rextended's combined code the script exits with the error "failure: Fetch failed with status 206".
During the script's execution I see the log entry "Download from https://view.sentinel.turris.cz/greylist-data/greylist-latest.csv to RAM FINISHED" four or five times, and the last one is "failure: Fetch failed with status 206".
How can I resolve this?
Is it also possible to download .txt files with this script? That is what really matters to me.
Thank you so much
 
User avatar
inteq
Member
Member
Posts: 412
Joined: Wed Feb 25, 2015 8:15 pm
Location: Romania

Re: Max size of variables still at 4096!? Answer is NO

Fri Jan 26, 2024 6:12 am

Hi,
I know it's an old post, but I think both of your scripts are useful.
[…]
with rextended's combined code the script exits with the error "failure: Fetch failed with status 206"
[…]
How can I resolve this?
This seems to be an error on the web-server side regarding the requested range.
I am getting a 416 on a test web server with a 137 KB file of 8770 lines.
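
A possible workaround (my own sketch, not something from this thread; $readfilesafe is a name I made up): wrap the chunked $readfile call in an error handler and fall back to a single plain fetch, accepting the ~64 KB data cap, whenever the server or the RouterOS build rejects the ranged request:

:global readfilesafe do={
    :global readfile
    :local result ""
    # try the chunked, ranged download first
    :do {
        :set result [$readfile $1]
    } on-error={
        # the Range request was refused (e.g. status 206/416 treated as an error):
        # fall back to one plain fetch, limited to ~64 KB of returned data
        :put "Ranged download failed, falling back to a single fetch"
        :set result ([/tool fetch url=$1 output=user as-value]->"data")
    }
    :return $result
}

It could then be called in place of $readfile, e.g. :local data [$readfilesafe $url] inside the combined script.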
