Imported file length 64512 bytes
Sorry, but the limit for reading from a file is still the same: 64512 bytes. What has changed here?
(ignoring the WRITING of a file)
viewtopic.php?t=197190#p1008826
I did not understand (seriously, no joke)...
This creates, on both 6.x and 7.x, one variable of 41.943.040 B (~40 MB):
:global test "0123456789"
:for i from=1 to=22 do={:set test ($test . $test);:put [:len $test]}
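As a cross-check of the arithmetic (a sketch in Python, not RouterOS code): the same doubling loop starts at 10 bytes and doubles 22 times, so the final length is 10 * 2^22 = 41.943.040 bytes.

```python
# Mirror of the RouterOS doubling loop above:
# start with 10 bytes, double the string 22 times.
test = "0123456789"
for i in range(22):
    test = test + test      # same as :set test ($test . $test)
print(len(test))            # 41943040 (~40 MB)
```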
The limit is the memory on the board,
but if a variable is larger than ~64KB, /system/script/environment becomes unusable and WinBox on that session crashes or reports
"ERROR: action failed".
On routers with less memory, already at ~40MB:
"Console has crashed; please log in again."
I never finished the blacklist import script,
but with my method it is possible to read chunks of ~64KB,
combine them all into a single variable without worrying about where one chunk ends and the next begins,
and then finally process the list.
Based on my idea of 2 years ago:
viewtopic.php?f=9&t=177530#p872372
Simply do this (refreshed old-glory code):
:global thefile ""
{
:local url "https://www.iwik.org/ipcountry/US.cidr"
:local filesize ([/tool fetch url=$url as-value output=none]->"downloaded") ; # "downloaded" is reported in KiB, not bytes
:local maxsize 64512 ; # the maximum supported readable size of one block from a file, in bytes (63 KiB)
:local start 0
:local end ($maxsize - 1)
:local partnumber ($filesize / ($maxsize / 1024)) ; # both values in KiB
:local remainder ($filesize % ($maxsize / 1024))
:if ($remainder > 0) do={ :set partnumber ($partnumber + 1) }
:for x from=1 to=$partnumber step=1 do={
:set thefile ($thefile . ([/tool fetch url=$url http-header-field="Range: bytes=$start-$end" as-value output=user]->"data"))
:set start ($start + $maxsize)
:set end ($end + $maxsize)
}
}
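To make the chunk arithmetic explicit, here is a sketch in Python (not RouterOS), assuming, as the script above does, that fetch reports the file size in KiB while the Range offsets are in bytes; the 1007 KiB file size is just a hypothetical example value:

```python
def plan_chunks(filesize_kib, maxsize=64512):
    """Compute the byte ranges the script above fetches.

    filesize_kib: file size as reported by fetch's "downloaded" (KiB).
    maxsize: largest block readable in one go (bytes); 64512 B = 63 KiB.
    """
    chunk_kib = maxsize // 1024                # 63
    parts = filesize_kib // chunk_kib
    if filesize_kib % chunk_kib > 0:           # leftover KiB -> one extra part
        parts += 1
    # Part i covers bytes [i*maxsize, i*maxsize + maxsize - 1]; the last
    # range may extend past the end of file, the server just sends less.
    return [(i * maxsize, i * maxsize + maxsize - 1) for i in range(parts)]

ranges = plan_chunks(1007)                     # hypothetical ~1 MB file
print(len(ranges))                             # 16 parts
print(ranges[0])                               # (0, 64511)
```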
:put [:len $thefile]
...so as to have one single file in memory of (at the moment I write) 1.031.093 B (~1 MB).
And how to save the file back after the variable (the file in memory) has been modified?
:execute ":put \$thefile" file=newfile.txt
But the resulting file is dirty, because ":put" always appends 2 bytes (\r\n) at the end.
Example with your URL:
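If those two extra bytes matter, they can simply be trimmed after reading the file back; a sketch in Python (not RouterOS) of the trim, assuming the data ends with exactly one extra CR+LF pair:

```python
def strip_trailing_crlf(data: bytes) -> bytes:
    """Remove the single trailing \r\n that ':put' appends."""
    if data.endswith(b"\r\n"):
        return data[:-2]
    return data          # nothing to trim

trimmed = strip_trailing_crlf(b"1.2.3.0/24\r\n5.6.7.0/24\r\n")
```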
:global thefile ""
{
:local url ("https://" . "view.sentinel.turris.cz/greylist-data/greylist-latest.csv") ; # split just so the forum does not alter the URL
:local filesize ([/tool fetch url=$url as-value output=none]->"downloaded") ; # "downloaded" is reported in KiB, not bytes
:local maxsize 64512 ; # the maximum supported readable size of one block from a file, in bytes (63 KiB)
:local start 0
:local end ($maxsize - 1)
:local partnumber ($filesize / ($maxsize / 1024)) ; # both values in KiB
:local remainder ($filesize % ($maxsize / 1024))
:if ($remainder > 0) do={ :set partnumber ($partnumber + 1) }
:for x from=1 to=$partnumber step=1 do={
:set thefile ($thefile . ([/tool fetch url=$url http-header-field="Range: bytes=$start-$end" as-value output=user]->"data"))
:set start ($start + $maxsize)
:set end ($end + $maxsize)
}
}
:put [:len $thefile]
302.116 B = ~300 KB (at the moment I write and test the script)