Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

I may have missed the answer above, but does dns|sip mean the IP is added if ‘dns’ or ‘sip’ is in the line? Or does it mean that ‘dns’ and ‘sip’ need to be present for the IP to be added?
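Judging by the script body later in the thread, the heirule is applied to each line with the ~ regex operator, so dns|sip should behave as “dns or sip”: the IP is added when the line contains either word. A quick terminal check (the sample strings are made up):

{
 :put ("incoming dns query" ~ "dns|sip"); # true - "dns" alone is enough
 :put ("sip register scan" ~ "dns|sip");  # true - "sip" alone is enough
 :put ("telnet probe" ~ "dns|sip");       # false - neither word is present
}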

Overall, the ‘final’ version is really good. :slight_smile: Thank you!

“Thank you.” The following is just intended to lower the adoption threshold and suggest improvements.

Some minor notes: this script

  • doesn’t have a distinguishable name, which would make it easier to find; something along the lines of “Shumkov msatter Blacklister”
  • has no script versioning in # comments (for clarity, typically a release date such as v2021.09.11)
  • the code doesn’t contain a comment with the origin source reference URL (this thread, http://forum.mikrotik.com/t/address-lists-downloader-dshield-spamhaus-drop-edrop-etc/133640/1), so after a million copy-pastes and rip-offs you could still find the original version
  • maybe more importantly, it doesn’t quote the heirule legend (https://project.turris.cz/greylist-data/legend.txt), which would make it easier to scope your blocking options:
    • amplifiers Easily exploitable services for amplification
    • broken_http Broken inbound HTTP (known services)
    • cryptocoin Cryptocoin miners
    • databases Database servers
    • dns Incoming DNS queries
    • http_scan HTTP/S scans
    • low_ports Low ports (<1024)
    • netbios NetBIOS
    • netis Netis router exploit
    • ntp NTP
    • proxy_scan Scans for HTTP/S and SOCKS proxies
    • remote_access Remote access services (RDP, VNC, etc.)
    • samba Samba (Windows shares)
    • sip SIP ports
    • ssdp SSDP
    • ssh SSH
    • synology Synology NAS
    • telnet Telnet
    • torrent Common Torrent ports
  • especially since the last release post defaults to:
    • heirule=http, which strictly isn’t an official heirule but, less obviously, still matches the heirules broken_http and http_scan
    • and uses nolog=1, which hides script progress, making the adoption threshold higher
    • has no comment parameter like comment=turris-http, which would make the relation between the address list and the heirule clearer
    • as a more adoptable default I suggest something along the lines of
}; # do
$update url=https://project.turris.cz/greylist-data/greylist-latest.csv listname=turris comment=Turris-all timeout=8d
}
  • there are no final instructions on use:
    • add the script to /system scripts
    • add a /system scheduler entry for the script (a minimal sketch follows at the end of this list)
    • add a firewall rule to actually block traffic according to the “turris” address list this scheduled script generates
  • /ip firewall filter add action=drop chain=input comment="Shumkov msatter Blacklister" in-interface=eth0 log=yes log-prefix=TURRIS src-address-list=turris
  • I challenge you to explain the origin of the name “hei” rule
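On the scheduler step above, a minimal entry could look like the line below (the script name matches the one used later in this thread; the interval and start time are just placeholders):

 /system scheduler add name=blacklist-update interval=1d start-time=03:00:00 on-event="/system script run Advanced-Downloader"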

Something from the Czech language?

I don’t know.
But explain (find out), not speculate (guess) :wink:

The reason I did not put my name in the script is because of the black cat that is sitting above in the thread.

For example, with SIP you often have a subscription and only the SIP provider’s IP address connects in. If you only allow that IP to connect to your SIP service, you won’t need the SIP part of the address list. When you offer a SIP service yourself and don’t know in advance where your incoming connections come from, the address list helps.

Where and how you use the script is up to you; we did the work to find a good solution. It is combined knowledge from different people and work spread over years, not only in this thread. MikroTik gives us a basic set of tools, and combining those gives a good result in the end.
The scripts we publish are wrapped in { } so they can be run from the terminal. When the result is not what you expected, press arrow-up until you reach the line you want to change, make your change and press enter.

About the heirules and how they are used: http://forum.mikrotik.com/t/address-lists-downloader-dshield-spamhaus-drop-edrop-etc/133640/1

…and the black cat is, as always, also underneath.

:slight_smile:

Nope, Chinese:

https://project.turris.cz/greylist-data/legend.txt (see the ascii art character at the beginning)
https://dictionary.hantrainerpro.com/chinese-english/translation-hei_black.htm

Does the “Import was NOT successful!” error appear if there were no changes to the list, when using the noerase= option?

[k@a] > /system script run Advanced-Downloader 
Starting import of address-list: turris
Entries not conditional deleted in address-list: turris
List identified as a IPv4 list
Reading Part: 1 0 - 63999
Reading Part: 2 63488 - 127487
Reading Part: 3 126976 - 190975
Reading Part: 4 190464 - 254463
Reading Part: 5 253952 - 317951
Import was NOT successfull! Check if the list turris is still being maintained.
Restoring backup list: turris
[k@a] >

Also, does the “Restoring backup list” need to happen if noerase= is set?


I changed the list to

 $update url=https://lists.blocklist.de/lists/all.txt listname=BlockList-DE timeout=1d noerase=1

And get this output

/system script run BlockList-DE
Starting import of address-list: BlockList-DE
Entries not conditional deleted in address-list: BlockList-DE
List identified as a IPv4 list
Reading Part: 1 0 - 63999
Reading Part: 2 63488 - 127487
Reading Part: 3 126976 - 190975
Reading Part: 4 190464 - 254463
Reading Part: 5 253952 - 317951
Reading Part: 6 317440 - 381439
Completed reading 1 items into address-list BlockList-DE.

The one entry that gets added is 0.0.0.0, which isn’t in the list.

According to the Turris support, it is updated daily.

Still having this issue…

If I use the original script, it imports around a quarter of it and stops at the IP addresses starting with 150 (the 64k limit?).

How can I (we) figure out where/why this doesn’t import?

I read this: https://project.turris.cz/en/greylist/

And look at the dates of the files: https://project.turris.cz/greylist-data/

Is the Turris support then wrong on this?

Backup and restore: on failure it leaves you with the original list in place.
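For reference, the mechanism in the script boils down to parking the existing entries under a "backup"-prefixed list name before the import, removing that backup on success and moving the entries back on failure. A simplified sketch, using the turris list name from earlier:

/ip firewall address-list
# before the import: park the current entries under the backup name
:foreach i in=[find list="turris"] do={ set list="backupturris" $i }
# ...import runs...
# on failure: move the parked entries back under the original name
:foreach i in=[find list="backupturris"] do={ set list="turris" $i }
# on success the backup entries are removed instead: remove [find list="backupturris"]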


I had a first look and did not find a reason why it should not import, so I will have a second look in time.

Second look was a success and the following line worked for me:

$update url=https://lists.blocklist.de/lists/all.txt listname=BlockList-DE delimiter=("\n") timeout=1d noerase=1

Result:

{... $update url=https://lists.blocklist.de/lists/all.txt listname=BlockList-DE delimiter=("\n") timeout=1d noerase=1
{... }
Starting import of address-list: BlockList-DE
Entries not conditional deleted in address-list: BlockList-DE
Using config-line defined delimiter: "
                                      "
Reading Part: 1 0 - 63999
Reading Part: 2 63488 - 127487
Reading Part: 3 126976 - 190975
Reading Part: 4 190464 - 254463
Reading Part: 5 253952 - 317951
Reading Part: 6 317440 - 381439
Completed reading 23347 items into address-list BlockList-DE.

The display of the delimiter is broken by the \n, which is a newline.
I got the error message earlier and will have a look at that too.

https://view.sentinel.turris.cz/greylist-data/archive/2022/

It shows a new file every day :slight_smile:

Cool. I was using ‘delimiter’ on the original versions.. I didn’t realize it would work, or even be needed on the newer ‘advanced’ version.
Still only adding the 1 item though.. 0.0.0.0

{
/ip firewall address-list
:local update do={
 :put "Starting import of address-list: $listname"
 :if ($nolog = null) do={:log warning "Starting import of address-list: $listname"}

 :local maxretry 3
 :local retrywaitingtime 120s
 :local retryflag true
 :for retry from=1 to=$maxretry step=1 do={
  :if (retryflag) do={ :set $retryflag false; :set $counter 0
  :if (retry > 1) do={
   :put "Source file changed. Retrying after a $retrywaitingtime wait..."
   :if ($nolog = null) do={:log warning "Source file changed. Retrying after a $retrywaitingtime wait..."}
   :delay $retrywaitingtime  }
  
 :local filesize ([/tool fetch url=$url src-address=192.0.2.1 user=$user password=$password keep-result=no as-value]->"total")
 :local start 0
 :local maxsize 64000;	        # requested chunk size
 :local end ($maxsize - 1);	# because start is zero the maxsize has to be reduced by one
 :local partnumber	 ($filesize / ($maxsize / 1024)); # how many chunks of maxsize there are
 :local remainder	 ($filesize % ($maxsize / 1024)); # the last partial chunk 
 :if ($remainder > 0)    do={ :set $partnumber ($partnumber + 1) }; # total number of chunks
 :if ($heirule != null) do={:put "Using as extra filtering: $heirule"} else={:set $heirule "."}
 # remove the current list completely unless "noerase" is set (erasing is the default)
  :if ($noerase = null) do={  
   :if ($timeout = null) do={:set $timeout 00:00:00; :do {:foreach i in=[/ip firewall address-list find list=$listname] do={/ip firewall address-list set list=("backup".$listname) $i }} on-error={} } else={
   :do {:foreach i in=[/ip firewall address-list find list=$listname dynamic] do={/ip firewall address-list set list=("backup".$listname) $i }} on-error={} };                
   :put ("Conditional deleting all".$dynamic." entries in address-list: $listname")
   :if ($nolog = null) do={:log warning ("Conditional deleting all".$dynamic." entries in address-list: $listname")}
  } else={:put "Entries not conditional deleted in address-list: $listname"}; # ENDIF ERASE
 :for x from=1 to=$partnumber step=1 do={
   # get filesize to be compared to the original one and if changed then retry
   :local comparesize ([/tool fetch url=$url src-address=192.0.2.1 user=$user password=$password keep-result=no as-value]->"total")
   
#:set $comparesize 5 

   # fetching the chunks from the webserver when the size of the source file has not changed
   # empty array when the source file changed. No processing is done till the next complete retry
   :if ($comparesize = $filesize) do={:set $data ([:tool fetch url=$url src-address=192.0.2.1 user=$user password=$password http-header-field="Range: bytes=$start-$end" output=user as-value]->"data")} else={:set $data [:toarray ""]; :set $retryflag true}
     #:if ($ownposix = null) do={
  # determining the used delimiter in the list if not provided in the config
   # this only run once and so the impact on the import time is low
    :local ipv4Posix	  "^[0-9]{1,3}\\.[0-9]{1,3}\\.[0-9]{1,3}\\.[0-9]{1,3}"
    :local ipv4rangePosix "^[0-9]{1,3}\\.[0-9]{1,3}\\.[0-9]{1,3}\\.[0-9]{1,3}/[0-9]{1,2}"
    :local domainPosix	  "^.+\\.[a-z.]{2,7}"
    :local sdata $data;     
    :while ([:len $sdata]!=0 && $delimiter = null) do={ # The check on length of $sdata is for if no delimiter is found.
       	:local sline [:pick $sdata 0 [:find $sdata "\n"]]; :local slen [:len $sline];
       	# set posix depending of type of data used in the list
       	:if ($sline ~ $ipv4Posix)	do={:set $posix $ipv4Posix;	 :set $iden "List identified as a IPv4 list"}
       	:if ($sline ~ $ipv4rangePosix)	do={:set $posix $ipv4rangePosix; :set $iden "List identified as a IPv4 with ranges list"}
       	:if ($sline ~ $domainPosix)	do={:set $posix $domainPosix;	 :set $iden "List identified as a domain list"}
       	:if ($sline ~ $posix) do={:put $iden}
      	:if ($sline ~ $posix) do={ # only explore the line if there is match at the start of the line.
	 :do {:if ([:pick $sline 0 ($slen-$send)] ~ ($posix."\$")|| $send > $slen) do={:set $delimiter [:pick $sline ($slen-$send) ($slen-($send-1))]; :set $result true} else={:set $send ($send+1);} } while (!$result);
	}; #IF posix
	:set $sdata [:pick $sdata ([:find $sdata "\n"]+1) [:len $sdata]];
	:if ($delimiter != null) do={:local sdata [:toarray ""]}; # Clear array sdata; it is not needed anymore, and this makes the While end
    }; #WHILE END $sdata
    :local sdata [:toarray ""] 
   #} else={:put "User defind Posix: $ownposix"; :set $posix $ownposix } ; # ENDIF ownposix = null   
   :if ($posix = null && $delimiter != null) do={:set $posix "."; :put "Using config-line defined delimiter: \"$delimiter\""}; # delimiter provided by the config line
   :if (!retryflag) do={:put "Reading Part: $x $start - $end"}   
   :if ($timeout = null) do={:local timeout 00:00:00}; # if no timeout is defined make it a static entry.    
   # Only remove the first line if you are not at the start of the list
   
   :if ($start > 0) do={:set $data [:pick $data ([:find $data "\n"]+1) [:len $data]]}
     :while ([:len $data]!=0) do={
       :local line [:pick $data 0 [:find $data "\n"]]; # create only once and checked twice as local variable
       :if ( $line ~ $posix && $line~heirule) do={    
        :do {add list=$listname address=[:pick $data 0 [:find $data $delimiter]] comment=$comment timeout=$timeout; :set $counter ($counter + 1)} on-error={}; # on error avoids any panics        
       }; # if IP address && extra filter if present
      :set $data [:pick $data ([:find $data "\n"]+1) [:len $data]]; # removes the just added IP from the data array
      # Cut off the end of the chunks by removing the last lines... very dirty but it works
      :if ([:len $data] < 256) do={:set $data [:toarray ""]}    
     }; # while

  :set $start (($start-512) + $maxsize); # shifts the subsequent start back by 512
  :set $end (($end-512) + $maxsize); # shifts the subsequent end back by 512 to keep the chunks overlapping
  }; # if retryflag
 }; #do for x
 
}; # for retry
 :if ($counter < 1) do={:set $resultline "Import was NOT successfull! Check if the list $listname is still being maintained."} else={:set $resultline "Completed reading $counter items into address-list $listname." } 
 :put $resultline
 :if ($nolog = null) do={:log warning $resultline }
 :if ($counter > 0) do={:do {/ip firewall address-list remove [find where list=("backup".$listname)]} on-error={} } else={
 :do {:foreach i in=[/ip firewall address-list find list=("backup".$listname)] do={/ip firewall address-list set list=$listname $i }} on-error={}
 :put "Restoring backup list: $listname" 
 :if ($nolog = null) do={:log warning "Restoring backup list: $listname"}
 }; # if counter restore on failure and remove on success
}; # do
$update url=https://lists.blocklist.de/lists/all.txt listname=BlockList-DE user="anonymous" password="anonymous" timeout=1d noerase=1
}

# Config-line settings to be used:
# url=	        https://name.of.the.list
# listname=	name of address-list

# Optional settings
# timeout=	the time the entry should be active. If omitted, static entries are created.
# comment=	puts this comment on every line in the chosen address-list (default: no comment)
# heirule=	selects, per line, on a word whether to import it or not (default: no heirule)
# noerase=	any value, then the current list is not erased (default: erase)
# ownposix=	allows entering your own regex posix to be used (not active at this moment)
# nolog=        any value, then don't write to the log (default: writing to log)

And the output..

system script run blocklist-de 
Starting import of address-list: BlockList-DE
Entries not conditional deleted in address-list: BlockList-DE
List identified as a IPv4 list
Reading Part: 1 0 - 63999
Reading Part: 2 63488 - 127487
Reading Part: 3 126976 - 190975
Reading Part: 4 190464 - 254463
Reading Part: 5 253952 - 317951
Completed reading 1 items into address-list BlockList-DE.

Humm… Sometimes it does work.. Sometimes it doesn’t.. Humm…

Last time I checked, the greylist was only updated weekly. You show a link to Sentinel and that is updated more frequently. It looks like the Sentinel files have replaced the greylist files now.
I did not look into the files.

This is an excellent page showing a lot of near-realtime data.

https://view.sentinel.turris.cz/?period=1w

The script could be adapted to not leave 0.0.0.0 in the list and to subtract one from the counter, back to zero. Maybe then you will get a warning.
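A possible adaptation, sketched against the script above (untested; it assumes the stray entry is always exactly 0.0.0.0 and would go after the import loop, before the counter is evaluated):

# drop the bogus entry and correct the counter
:if ([:len [/ip firewall address-list find list=$listname address="0.0.0.0"]] > 0) do={
 /ip firewall address-list remove [find list=$listname address="0.0.0.0"]
 :set $counter ($counter - 1)
}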

I think the \n is not found as a delimiter because here it is two characters.
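That difference is easy to see from the terminal: inside a quoted RouterOS string the escape "\n" is a single newline character, while a backslash followed by the letter n is two characters.

{
 :put [:len "\n"];  # 1 - the escape is one newline character
 :put [:len "\\n"]; # 2 - a literal backslash followed by the letter n
}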

Update:
When testing my default script I got this message: failure: closing connection: <301 Moved Permanently "https://view.sentinel.turris.cz/greylist-data/"> 217.31.192.69:443 (5), so I had to adapt the URL in my script. This is because RouterOS does not follow 301 redirects.

Yeah, that was where I found the historical data.. :slight_smile:

So I was talking Greylist and you were talking Sentinel, and so we were both right. :wink:

Can the ‘type’ be set on the line with the URL too?

       	# set posix depending of type of data used in the list
       	:if ($sline ~ $ipv4Posix)	do={:set $posix $ipv4Posix;	 :set $iden "List identified as a IPv4 list"}
       	:if ($sline ~ $ipv4rangePosix)	do={:set $posix $ipv4rangePosix; :set $iden "List identified as a IPv4 with ranges list"}
       	:if ($sline ~ $domainPosix)	do={:set $posix $domainPosix;	 :set $iden "List identified as a domain list"}
       	:if ($sline ~ $posix) do={:put $iden}

Something like

$update url=https://lists.blocklist.de/lists/all.txt listname=BlockList-DE user="anonymous" password="anonymous" timeout=1d posix=ipv4Posix

## Because this line says it is not active at the moment
# ownposix=	allows entering your own regex posix to be used (not active at this moment)

Or am I looking in the wrong place to find the variable that needs to be set?

If I remember it well, I first implemented ownposix and then automatic detection superseded it. I kept ownposix in there for a reason; I think that is what you want to use it for.

Re-activate this by removing the # before :if ($own…:

     #:if ($ownposix = null) do={
  # determining the used delimiter in the list if not provided in the config

And do the same here, before } else={:pu…:

    :local sdata [:toarray ""] 
   #} else={:put "User defind Posix: $ownposix"; :set $posix $ownposix } ; # ENDIF ownposix = null

You then have to append a posix string to the call line: ownposix="[0-9]{1,3}\.[0-9]{1,3}\.[0-9]{1,3}\.[0-9]{1,3}"
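With those two lines uncommented, a complete call could then look something like this (untested sketch; it reuses the blocklist.de example from earlier in the thread, and the doubled backslashes follow the way the script writes its own built-in posix strings):

 $update url=https://lists.blocklist.de/lists/all.txt listname=BlockList-DE timeout=1d noerase=1 ownposix="^[0-9]{1,3}\\.[0-9]{1,3}\\.[0-9]{1,3}\\.[0-9]{1,3}"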



@Shumkov, @rextended and @msatter
You are all incredible. This is a fantastic workaround for the 64K file limit.
I was actually able to load this very large list:
$update url=https://iplists.firehol.org/files/blocklist_net_ua.ipset listname=blocklist_net_ua delimiter=("\n")
It took a bit, as the list has over 102K entries, but within 5 minutes it was loaded on my RB4011.

Amazing!!!

I’m seeing that lists like this one: $update url=https://lists.blocklist.de/lists/all.txt listname=BlockList-DE delimiter=("\n") timeout=1d noerase=1
have some IPv6 entries.
It would be nice if the script put those entries into a corresponding IPv6 firewall address list, even with the same listname.
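A rough sketch of how such a branch could look inside the import loop (not part of the script; the IPv6 pattern is deliberately loose and only meant as an illustration):

:local ipv6Posix "^[0-9a-fA-F]{1,4}:[0-9a-fA-F:]+"
:if ($line ~ $ipv6Posix) do={
 :do {/ipv6 firewall address-list add list=$listname address=[:pick $data 0 [:find $data $delimiter]] comment=$comment timeout=$timeout} on-error={}
}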

Hello!
Could it be a good idea to integrate Tor exit lists?
https://check.torproject.org/torbulkexitlist
https://www.dan.me.uk/torlist/
https://www.dan.me.uk/torlist/?exit

I tried to integrate them with $update with no success; the addresses came out as 1, 2, 3, 4, 5, 6…
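Those lists are plain one-address-per-line text, so the trick that fixed the blocklist.de import above might be worth trying (untested sketch; the list name and timeout are arbitrary):

 $update url=https://check.torproject.org/torbulkexitlist listname=tor-exits delimiter=("\n") timeout=1d noerase=1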