Is holding 8 MB from a txt file in a variable possible?

@optio you really need to cut the chunks short, because you don't know whether a chunk boundary falls in the middle of an IP address. So I re-introduced the 512-byte cut-back and the 256-byte early end of $data. To read the complete list, I added that the last chunk does not end early when 256 bytes are left over.

# Turris Import by Blacklister and edited by Optio
# 20210823 new version that directly downloads from the external server
# 20240331 rewritten to fetch the whole file and write it to a local file and then import it
# 20240401 avoiding perfect storm by reducing chunkSize when calculating the remainder
{
# import config - delay for slow routers
#:delay 1m
/ip firewall address-list
:local update do={
 :put "Starting import of address-list: $listname"
 /tool fetch url=$url dst-path="/$listname.txt" as-value
 # delay to wait file flush after fetch
 :delay 1
 :local filesize [/file get "$listname.txt" size]
 :local start 0
 :local chunkSize 32767;		# requested chunk size
 :local partnumber	($filesize / $chunkSize); # how many chunks are a full chunkSize
 :local remainder	($filesize % ($chunkSize-512)); # the last, partial chunk, using the reduced chunkSize
 :if ($remainder > 0) do={ :set partnumber ($partnumber + 1) }; # total number of chunks
 :put "Deleting all Dynamic entries in address-list: $listname"
 :if (heirule != null) do={:put "Using as extra filtering: $heirule"}
 :if ($heirule = null) do={:set $heirule "."}

 # remove the current list completely
 :do {remove [find where list=$listname dynamic]} on-error={};

 :for x from=1 to=$partnumber step=1 do={
   :local data ([:file read offset=$start chunk-size=$chunkSize file="$listname.txt" as-value]->"data")
   # Remove the first (partial) line only if you are not at the start of the list
   :if ($start > 0) do={:set data [:pick $data ([:find $data "\n"]+1) [:len $data]]}
   :while ([:len $data]!=0) do={
     :local line [:pick $data 0 [:find $data "\n"]]; # create only once and checked twice as local variable
     :if ($line~"^[0-9]{1,3}\\.[0-9]{1,3}\\.[0-9]{1,3}\\.[0-9]{1,3}" && $line~heirule) do={
       :local addr [:pick $data 0 [:find $data $delimiter]]
       #:put "Adding address: $addr"
       :do {add list=$listname address=$addr comment=$description timeout=$timeout} on-error={}; # on error avoids any panics
     }; # if IP address && extra filter if present
     :set data [:pick $data ([:find $data "\n"]+1) [:len $data]]; # removes the just added IP from the data array
     # Cut off the end of the chunk by removing the last lines... very dirty, but it works
     :if (([:len $data] < 256) && (x < $partnumber)) do={:set data [:toarray ""]}   
   }; # while

   #:set start ($start + $chunkSize)
   :set start (($start-512) + $chunkSize); # shifts each subsequent start back by 512 bytes
   
 }; #do for x
 /file remove "$listname.txt"
 :put "Completed importing $listname."
}; # do
$update url=https://view.sentinel.turris.cz/greylist-data/greylist-latest.csv listname=turris delimiter=, timeout=8d heirule=http
$update url=https://www.spamhaus.org/drop/drop.txt delimiter=("\_") listname=z-blocklist-SpamHaus timeout=2d
$update url=https://www.spamhaus.org/drop/edrop.txt delimiter=("\_") listname=z-blocklist-SpamHaus-edrop timeout=2d
}

@MTNick from the spamhaus topic you need the delimiter=("_")
If you have the link for me then I could have a look at it?

Greetings msatter. I have tried that delimiter, several of them, including not listing one. None work. Below are the spamhaus links:
$update url="https://" . "www.spamhaus.org/drop/drop.txt" delimiter=("_") listname=z-blocklist-SpamHaus timeout=2d
$update url="https://" . "www.spamhaus.org/drop/edrop.txt" delimiter=("_") listname=z-blocklist-SpamHaus-edrop timeout=2d

**** added " . " to the links for forum formatting purposes ****

@optio You were right. I assumed it was working correctly — it said it deleted 2934 & entered 2934. Running the new script from msatter above added more addresses, so it wasn't downloading all of them, as you said. See the difference between the previous download and the new script download: deleted 3834 & added 4007:
Screen Shot 2024-03-31 at 6.53.26 PM.png



It can be done with an additional data-buffer variable that holds the bytes from the previous chunk left over after the last newline character. Each new data chunk is appended to that buffer, parsed line by line, and matched with a single regex (the current script uses two regex matches combined with && on the same line; regex is expensive for the CPU, so a single string pattern is better than combining multiple). Then the bytes up to, and including, the last newline character are removed. There is no need to seek the offset backwards, and parsing the same data multiple times is avoided.
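The carry-over-buffer approach described here can be sketched as follows. This is an illustrative Python model, not RouterOS scripting: the chunk boundaries, the "," delimiter, and the combined IPv4-plus-"http" pattern (mirroring heirule=http) are assumptions chosen for the example.

```python
import re

# Single combined pattern instead of two separate regex matches joined
# with &&: an IPv4 address at the start of the line, followed anywhere
# by "http" (standing in for heirule=http).
PATTERN = re.compile(r"^[0-9]{1,3}\.[0-9]{1,3}\.[0-9]{1,3}\.[0-9]{1,3}.*http")

def import_chunks(chunks):
    buffer = ""
    addresses = []
    for chunk in chunks:
        buffer += chunk     # append the new chunk to the unparsed leftover
        # Split off everything up to (and including) the last newline;
        # the tail after it is carried over into the next iteration.
        head, sep, buffer = buffer.rpartition("\n")
        if sep:             # head now holds only complete lines
            for line in head.split("\n"):
                if PATTERN.match(line):
                    addresses.append(line.split(",")[0])   # delimiter=","
    if PATTERN.match(buffer):   # final line may lack a trailing newline
        addresses.append(buffer.split(",")[0])
    return addresses
```

Because only the unparsed tail is kept, no offset ever has to be seeked backwards and no line is parsed twice, even when a chunk boundary falls in the middle of an IP address.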



#$update url=https://view.sentinel.turris.cz/greylist-data/greylist-latest.csv listname=turris delimiter=, timeout=8d heirule=http
$update url=https://www.spamhaus.org/drop/drop.txt delimiter=("\_") listname=z-blocklist-SpamHaus timeout=2d
$update url=https://www.spamhaus.org/drop/edrop.txt delimiter=("\_") listname=z-blocklist-SpamHaus-edrop timeout=2d

I don’t find any problems importing SpamHaus:
z-blocklist-SpamHaus → 1008 entries
z-blocklist-SpamHaus-edrop → 363 entries

I still like the quick-and-dirty method I introduced a few years ago. It is simple, and by omitting the early end on the last chunk it is guaranteed to read the last line as well.

However, going back 512 bytes could create a problem with a file that fills all its chunks completely.

 :local chunkSize 32767;		# requested chunk size
 :local partnumber	($filesize / $chunkSize); # how many chunks are a full chunkSize
 :local remainder	($filesize % ($chunkSize - 512)); # the last, partial chunk
 :if ($remainder > 0) do={ :set partnumber ($partnumber + 1) }; # total number of chunks

Each subsequent chunk slides back 512 bytes, so the chunkSize is also reduced by 512 when calculating the remainder. This should remove the risk of a "perfect storm" happening.
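The arithmetic above can be modelled in a few lines. This is an illustrative Python sketch of the snippet (the original is RouterOS scripting); the function name and parameters are mine, and the claim about the perfect storm is the post's argument, shown here on a file size that is an exact multiple of chunkSize:

```python
# Model of the chunk-count calculation: every chunk after the first starts
# 512 bytes before the end of the previous one, so each pass only advances
# chunk_size - overlap bytes; the remainder therefore uses the reduced size.
def part_count(filesize, chunk_size=32767, overlap=512):
    partnumber = filesize // chunk_size            # chunks that are a full chunkSize
    remainder = filesize % (chunk_size - overlap)  # the last, partial chunk
    if remainder > 0:
        partnumber += 1                            # total number of chunks
    return partnumber
```

For a file of exactly 32767 bytes this yields 2 parts rather than 1, because the overlap means one full-size read does not cover the whole file.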

Greetings msatter. Spamhaus is working great! Thank you!

However, Sentinel (turris) is not importing all of the addresses. If you look at the csv file, there are 10100 addresses. 4007 were deleted & only 2869 were imported. See screenshot below
Screenshot 2024-04-01 at 11.20.48 AM.png

@MTNick that is correct, because of heirule=http. It only imports lines that contain "http" in the word(s) following the IP address on each line.

If you want to import all, then omit the heirule. This way you can create different address-lists from the same source file.
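The heirule's effect as an extra filter can be sketched like this. An illustrative Python model, not the actual RouterOS code: the function name is mine, and the "." fallback mirrors the script's substitution when no heirule is given (in a regex, "." matches any character, so every line passes).

```python
import re

# A line is kept only if it both starts with an IPv4 address and
# matches the heirule pattern; with no heirule, "." matches everything.
IPV4 = re.compile(r"^[0-9]{1,3}\.[0-9]{1,3}\.[0-9]{1,3}\.[0-9]{1,3}")

def keep_line(line, heirule="."):
    return bool(IPV4.match(line)) and bool(re.search(heirule, line))
```

Running the same source file through different heirule values is what lets you build different address-lists from one download.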

msatter, Thank you for the explanation of the heirule. Appreciate that. Everything is working as expected. Thank you so much!! Also a Thank You to optio as well. Appreciate the assistance from both of you!

I've added logging to the script, with a few additions. The code for logging was provided by optio. These are labeled with ":log warning" and added mostly right below the ":put" lines.

# Turris Import by Blacklister and edited by Optio
# 20210823 new version that directly downloads from the external server
# 20240331 rewritten to fetch the whole file and write it to a local file and then import it
# 20240401 avoiding perfect storm by reducing chunkSize when calculating the remainder
{
# import config - delay for slow routers
:delay 1m
:log warning "IP-Blocker script is running..."
/ip firewall address-list
:local update do={
 :put "Starting import of address-list: $listname"
 :log warning "Starting import of address-list: $listname"
 /tool fetch url=$url dst-path="/$listname.txt" as-value
 # delay to wait file flush after fetch
 :delay 1
 :local filesize [/file get "$listname.txt" size]
 :local start 0
 :local chunkSize 32767;		# requested chunk size
 :local partnumber	($filesize / $chunkSize); # how many chunks are a full chunkSize
 :local remainder	($filesize % ($chunkSize-512)); # the last, partial chunk, using the reduced chunkSize
 :if ($remainder > 0) do={ :set partnumber ($partnumber + 1) }; # total number of chunks
 :put "Deleting all Dynamic entries in address-list: $listname"
 :log warning "Deleting all Dynamic entries in address-list: $listname"
 :local listCount [:len [find list=$listname dynamic]]
 :put "Completed deleting $listname, deleted addresses count: $listCount"
 :log warning "Completed deleting $listname, deleted addresses count: $listCount"
 :if (heirule != null) do={:put "Using as extra filtering: $heirule"}
 :if ($heirule = null) do={:set $heirule "."}

 # remove the current list completely
 :do {remove [find where list=$listname dynamic]} on-error={};

 :for x from=1 to=$partnumber step=1 do={
   :local data ([:file read offset=$start chunk-size=$chunkSize file="$listname.txt" as-value]->"data")
   # Remove the first (partial) line only if you are not at the start of the list
   :if ($start > 0) do={:set data [:pick $data ([:find $data "\n"]+1) [:len $data]]}
   :while ([:len $data]!=0) do={
     :local line [:pick $data 0 [:find $data "\n"]]; # create only once and checked twice as local variable
     :if ($line~"^[0-9]{1,3}\\.[0-9]{1,3}\\.[0-9]{1,3}\\.[0-9]{1,3}" && $line~heirule) do={
       :local addr [:pick $data 0 [:find $data $delimiter]]
       #:put "Adding address: $addr"
       :do {add list=$listname address=$addr comment=$description timeout=$timeout} on-error={}; # on error avoids any panics
     }; # if IP address && extra filter if present
     :set data [:pick $data ([:find $data "\n"]+1) [:len $data]]; # removes the just added IP from the data array
     # Cut off the end of the chunk by removing the last lines... very dirty, but it works
     :if (([:len $data] < 256) && (x < $partnumber)) do={:set data [:toarray ""]}   
   }; # while

   #:set start ($start + $chunkSize)
   :set start (($start-512) + $chunkSize); # shifts each subsequent start back by 512 bytes
   
 }; #do for x
 /file remove "$listname.txt"
 :put "Completed importing $listname."
 :local listCount [:len [find list=$listname dynamic]]
 :put "Completed importing $listname, added addresses count: $listCount"
 :log warning "Completed importing $listname, added addresses count: $listCount"
 :put "Completed deleting downloaded file $listname"
 :log warning "Completed deleting downloaded file $listname"
}; # do
$update url=https://iplists.firehol.org/files/firehol_level2.netset delimiter=("\n") listname=z-blocklist-FireHOL-L2 timeout=3d
$update url=https://view.sentinel.turris.cz/greylist-data/greylist-latest.csv listname=z-blocklist-Sentinel delimiter=, timeout=3d heirule=http
$update url=https://www.spamhaus.org/drop/drop.txt delimiter=("\_") listname=z-blocklist-SpamHaus timeout=3d
$update url=https://www.spamhaus.org/drop/edrop.txt delimiter=("\_") listname=z-blocklist-SpamHaus-edrop timeout=3d
}
:log warning message="IP-Blocker script is COMPLETE"

@MTNick optimized logging, and also the displayed text when running in terminal.

Updated: shortened the time a list is unavailable during import by building a temporary list and, after removing the old one, replacing it with the temporary list containing the updated entries. The advantage is that the old list stays active while the file is being read and imported.

# Turris Import by Blacklister and edited by Optio
# 20210823 new version that directly downloads from the external server
# 20240331 rewritten to fetch the whole file and write it to a local file and then import it
# 20240401 avoiding perfect storm by reducing chunkSize when calculating the remainder
# 20240402 import new addresses into a temporary list and swap it with the active list, keeping the time the list is inactive as short as possible
# also save and display a count of static addresses present in an address-list
{
# import config - delay for slow routers
#:delay 1m
:log warning "IP-Blocker script started"
/ip firewall address-list
:local update do={
 
 :if (heirule != null) do={:set $filtering ", filtering on: $heirule"}
 :put "Start importing address-list: $listname$filtering"
 :log warning "Start importing address-list: $listname$filtering"
 
 /tool fetch url=$url dst-path="/$listname.txt" as-value
 # delay to wait file flush after fetch
 :delay 1
 :local filesize [/file get "$listname.txt" size]
 :local start 0
 :local chunkSize 32767;		# requested chunk size
 :local partnumber	($filesize / $chunkSize); # how many chunks are a full chunkSize
 :local remainder	($filesize % ($chunkSize-512)); # the last, partial chunk, using the reduced chunkSize
 :if ($remainder > 0) do={ :set partnumber ($partnumber + 1) }; # total number of chunks
 
 :local listCount [:len [find list=$listname dynamic]]
 
 :put "Deleting $listCount entries (dynamic) from address-list: $listname"
 :log warning "Deleting $listCount entries (dynamic) from address-list: $listname"

 :if ($heirule = null) do={:set $heirule "."}

 # remove the current dynamic entries completely
 #:do {remove [find where list=$listname dynamic]} on-error={};
 
 :set $listnameTemp ($listname."temp")
 
 :for x from=1 to=$partnumber step=1 do={
   :local data ([:file read offset=$start chunk-size=$chunkSize file="$listname.txt" as-value]->"data")
   # Remove the first (partial) line only if you are not at the start of the list
   :if ($start > 0) do={:set data [:pick $data ([:find $data "\n"]+1) [:len $data]]}
   :while ([:len $data]!=0) do={
     :local line [:pick $data 0 [:find $data "\n"]]; # create only once and checked twice as local variable
     :if ($line~"^[0-9]{1,3}\\.[0-9]{1,3}\\.[0-9]{1,3}\\.[0-9]{1,3}" && $line~heirule) do={
       :local addr [:pick $data 0 [:find $data $delimiter]]
       :do {add list=$listnameTemp address=$addr comment=$description timeout=$timeout} on-error={}; # on error avoids any panics
     }; # if IP address && extra filter if present
     :set data [:pick $data ([:find $data "\n"]+1) [:len $data]]; # removes the just added IP from the data array
     # Cut off the end of the chunk by removing the last lines... very dirty, but it works
     :if (([:len $data] < 256) && (x < $partnumber)) do={:set data [:toarray ""]}   
   }; # while

   #:set start ($start + $chunkSize)
   :set start (($start-512) + $chunkSize); # shifts each subsequent start back by 512 bytes
 }; #do for x
 
  /file remove "$listname.txt"
  :put "Deleted downloaded file: $listname.txt"
  :log warning "Deleted downloaded file: $listname.txt"
 
 # Swap out temp list and active list, shorten the time the list is empty
 :do {set list=$listnameTemp [find list=$listname !dynamic]}; # backup any fixed IP addresses to the temporary list
 :do {remove [find list=$listname]} on-error={}; # empty the complete list
 :do {set list=$listname [find list=$listnameTemp]} on-error={
  							:put "Import failed: while swapping out the old list with the temporary list: $listname";
  							:log error "Import failed: while swapping out the old list with the temporary list: $listname"
 							}
 
 :set $staticCount ""
 :if ([:len [find list=$listname !dynamic]] > 0) do={:set $staticCount "of which $[:len [find list=$listname !dynamic]] are static addresses"}
 
 :if ([:len [find list=$listnameTemp]] < 1) do={
 	:local listCount [:len [find list=$listname]]
 
 	:put "Completed updating address-list $listname with $listCount addresses $staticCount"
 	:log warning "Completed updating address-list $listname with $listCount addresses $staticCount"
 }
 
}; # do
$update url=https://iplists.firehol.org/files/firehol_level2.netset delimiter=("\n") listname=z-blocklist-FireHOL-L2 timeout=3d
$update url=https://view.sentinel.turris.cz/greylist-data/greylist-latest.csv listname=z-blocklist-Sentinel delimiter=, timeout=8d heirule=http
$update url=https://www.spamhaus.org/drop/drop.txt delimiter=("\_") listname=z-blocklist-SpamHaus timeout=3d
$update url=https://www.spamhaus.org/drop/edrop.txt delimiter=("\_") listname=z-blocklist-SpamHaus-edrop timeout=3d

:log warning message="IP-Blocker script COMPLETED running"
}

@msatter Perfect! The above script is awesome. It’s smooth & every step is logged with entry counts as well. Once again, I can’t thank you enough. Thank you for this update & cleaning up the dirt I added to your script :smiley: I know nothing about scripting. But, I’ll attempt anything with some direction.

Quick question: was the way I entered it wrong? Would/could it have caused issues? Just asking for reference & knowledge.

I have made another update: the import is now aware of any static entries added by the user to the active list. Those will be preserved, and a count of them is displayed on import.

It was not "dirt" but more a streamlining of the text to be more informative to the user.

You are not using the option available in this forum to display code, and you added quotation marks to the URL in the $update config line; that interfered with accepting the config line.

Greetings msatter. Thank you. I removed all quotation marks from the URLs for all of the lists. I had no idea that this caused issues. Happy to report that everything is working great!

Perhaps interesting for those collecting various sources to feed the scripts.

https://docs.paloaltonetworks.com/resources/edl-hosting-service

Palo Alto also provides various curated lists for free, such as M365, Azure, GCP, Zoom, etc.

Imports work just fine with the current script.

I believe that https://iplists.firehol.org has the most comprehensive collection of IP address lists, statistics, and clickable maps indicating where the crooks are located. Palo Alto is one of many contributors.

Hi all. The script above doesn't work on MikroTik RouterOS 6.49.13. Can it be tweaked to work on v6, or if possible could you please provide an alternative script that will work? Thank you in advance.

I get this error:
[admin@MikroTik] > system script run test
syntax error (line 41 column 71)
[admin@MikroTik] >

The error on that line occurs because :file read was introduced in RouterOS 7.13. For lower versions the script needs to be adapted to use fetch and retrieve the data in chunks via the Range header, but not with the same logic, since this script seeks the offset backwards 512 bytes from the previous chunk; in that case data from the previous chunk would also need to be stored. I have a similar script which does not do that: it holds in a data buffer only the bytes from the previous chunk that were not yet parsed (after the last newline). Such a script is easier to adapt to fetching by range. Maybe I will adapt it when I find time and post it. There are some examples above of how to fetch with the Range header, if you find them useful, or search the forum for other topics which may have such a script.
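For illustration, this is roughly what a ranged request looks like. A Python sketch only (the URL and chunk size are placeholders, not from any script in this thread); it just builds the request header, as defined for HTTP range requests (RFC 9110), where a cooperating server answers with 206 Partial Content:

```python
from urllib.request import Request

# Build a request for one byte range of a remote blocklist file.
# "Range: bytes=start-end" uses an inclusive end offset, so a chunk of
# chunk_size bytes starting at `start` ends at start + chunk_size - 1.
def range_request(url, start, chunk_size=32767):
    end = start + chunk_size - 1
    return Request(url, headers={"Range": f"bytes={start}-{end}"})
```

A v6 adaptation would issue one such fetch per chunk and carry the unparsed tail of each chunk over into the next, as described above, instead of seeking backwards.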

deleted. Started a new thread: https://forum.mikrotik.com/viewtopic.php?t=211726