Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Hello!
The new “output=user” parameter of “/tool fetch” opened up new scripting capabilities, and I decided to take full advantage of them.

  • the script needs no third-party servers: address lists are downloaded straight from the source and processed on the router itself.

  • the script does NOT save the downloaded files to disk (thereby avoiding premature wear and failure of the router's storage).

  • the script can be adapted to download and process any number of address lists in a similar format (the maximum file size is 63 KiB (64512 bytes), which is much better than the earlier 4 KiB limit); a minimal sketch of the fetch call follows this list.
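
For reference, a minimal sketch of what “output=user as-value” returns (using the DShield URL from the script below):

# with output=user the downloaded contents are returned in the "data" key; nothing is written to disk
:local result [/tool fetch url=https://www.dshield.org/block.txt output=user as-value]
:put ($result->"status")
:put [:len ($result->"data")]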

At the moment the script can download and update the following lists:

  • DShield
  • Spamhaus DROP
  • Spamhaus EDROP
  • Abuse.ch SSLBL

Variant 1:

/ip firewall address-list
:local update do={
    :do {
        :local data ([/tool fetch url=$url output=user as-value]->"data")
        # drop this list's previous entries, then refill from scratch
        remove [find list=blacklist comment=$description]
        :while ([:len $data]!=0) do={
            # process only lines that start with an IPv4 address
            :if ([:pick $data 0 [:find $data "\n"]]~"^[0-9]{1,3}\\.[0-9]{1,3}\\.[0-9]{1,3}\\.[0-9]{1,3}") do={
                :do {add list=blacklist address=([:pick $data 0 [:find $data $delimiter]].$cidr) comment=$description timeout=1d} on-error={}
            }
            :set data [:pick $data ([:find $data "\n"]+1) [:len $data]]
        }
    } on-error={:log warning "Address list <$description> update failed"}
}
$update url=https://www.dshield.org/block.txt description=DShield delimiter=("\t") cidr=/24
$update url=https://www.spamhaus.org/drop/drop.txt description="Spamhaus DROP" delimiter=("\_")
$update url=https://www.spamhaus.org/drop/edrop.txt description="Spamhaus EDROP" delimiter=("\_")
$update url=https://sslbl.abuse.ch/blacklist/sslipblacklist.txt description="Abuse.ch SSLBL" delimiter=("\r")

  • the script deletes all addresses matching the condition “list=blacklist comment=$description”, then refills the address list from scratch. This is simpler and faster.

Variant 2:

/ip firewall address-list
:local update do={
    :do {
        :local data ([/tool fetch url=$url output=user as-value]->"data")
        # cache the existing dynamic entries:
        # first half of the array holds internal IDs, second half the matching addresses
        :local array [find dynamic list=blacklist]
        :foreach value in=$array do={:set array ($array,[get $value address])}
        :while ([:len $data]!=0) do={
            :if ([:pick $data 0 [:find $data "\n"]]~"^[0-9]{1,3}\\.[0-9]{1,3}\\.[0-9]{1,3}\\.[0-9]{1,3}") do={
                :local ip ([:pick $data 0 [:find $data $delimiter]].$cidr)
                # new address: add it; existing address: prolong its timeout via the cached ID
                :do {add list=blacklist address=$ip comment=$description timeout=1d} on-error={
                    :do {set ($array->([:find $array $ip]-[:len $array]/2)) timeout=1d} on-error={}
                }
            }
            :set data [:pick $data ([:find $data "\n"]+1) [:len $data]]
        }
    } on-error={:log warning "Address list <$description> update failed"}
}
$update url=https://www.dshield.org/block.txt description=DShield delimiter=("\t") cidr=/24
$update url=https://www.spamhaus.org/drop/drop.txt description="Spamhaus DROP" delimiter=("\_")
$update url=https://www.spamhaus.org/drop/edrop.txt description="Spamhaus EDROP" delimiter=("\_")
$update url=https://sslbl.abuse.ch/blacklist/sslipblacklist.txt description="Abuse.ch SSLBL" delimiter=("\r")

  • the script does NOT delete still-current addresses, but prolongs their timeout instead. Addresses that are no longer in the downloaded list are removed by the system automatically once their timeout expires. This is more complex and slower, but it makes it possible to track the date/time when each address was added to the blacklist.
    Why does the script use an “array”?
    Because the default “find” function is VERY slow. Using an additional array speeds the script up several times, since operations are performed directly with indexes, bypassing the default “find” function.
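
To illustrate the index arithmetic, here is a toy example with string placeholders instead of real internal IDs (runnable in a terminal as-is):

# index:            0     1     2     3           4           5
:local array {"id0";"id1";"id2";"192.0.2.1";"192.0.2.2";"192.0.2.3"}
:local i [:find $array "192.0.2.2"]
# i = 4, and i - [:len $array]/2 = 4 - 3 = 1, so this prints "id1",
# the internal ID that belongs to 192.0.2.2
:put ($array->($i-[:len $array]/2))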

Required policy: read, write, test.
Perhaps this script will be useful to someone.

P.S. Sorry for my English.

Nice Work!

I added FireHOL Level2 to the script as well, in case you’re interested. Just added this line:

$update url=https://raw.githubusercontent.com/ktsaou/blocklist-ipsets/master/firehol_level2.netset description=“FireHOL Level2” delimiter=(“\n”)

-zeb

Hello:

Thank you for sharing. But the way you write functions is hard to understand. It would be great if some expert could rewrite it so the statements read like the official examples. Thank you.

Hi - This looks great. I will give it a try.

Update -
I just ran this and it works great - no errors, works perfectly.

What is general recommendation on how often to grab new lists - daily?

Am I correct that it removes or ignores duplicate entries?

It would be great to keep this updated with additional lists!

Thank you so much for this!!!

How does it handle 1.2.3.0/24 addresses? As far as I could tell, it enters 1.2.3.0 in the address list without the /24?

Update: I ran the script and it does handle the CIDR ranges correctly. Going to look at whether I can add some more lists.

Update 2: excellent script. I have added the option to filter on a specific label in the file; that can also be used to remove a list that is no longer needed from the current blacklist in the address list.

This appears to fail for me.

This is the correct syntax (the forum turned zeb's straight quotes into curly ones; it works when posted as code):

$update url=https://raw.githubusercontent.com/ktsaou/blocklist-ipsets/master/firehol_level2.netset description="FireHOL Level2" delimiter=("\n")

REALLY PLEASED with the script from Shumkov and the option added by MikroTik; it is now very easy to import lists without having to use other computers to prepare them up front.

That Level2 list is huge… trying to sort out the different levels they have. Any thoughts? Also, would you run this daily?

Do not forget about file size - maximum 63 KiB.
If the file size is larger than the maximum, only part of the file will be processed (the first 63 KiB), and the rest of the file will be discarded.
FireHOL Level2 is bigger than 63 KiB :)

What is general recommendation on how often to grab new lists - daily?

I set the scheduler interval to 8 hours.
In general, the interval depends on the specific list and the frequency of updating this list by its provider.
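
For example, assuming the script is saved as a system script named “update-blacklists” (the name is just an example):

/system scheduler add name=update-blacklists interval=8h on-event="/system script run update-blacklists"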

it removes or ignores duplicate entries?

The script removes only addresses that are in the “blacklist” list and have comment=$description.
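
For example, for the DShield list Variant 1 effectively runs:

/ip firewall address-list remove [find list=blacklist comment="DShield"]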

Ah - that makes sense. You are quite correct. Thanks for the explanation on the removal.

Are there any other lists you would consider, or a good source for them?

It would be nice if it were possible to apply a filter so that only the needed data ends up in the variable. Then there would be a lot more room in the variable:

:local data ([/tool fetch url=$url output=user as-value~"^[0-9]{1,3}.[0-9]{1,3}.[0-9]{1,3}.[0-9]{1,3}"]->"data");

It looks like FireHOL Level1 may be a better choice and is under the file size limit… barely. Any reason not to use it? Would a list that large have a pretty big performance hit on the router?

@Shumkov what was your goal/strategy for the lists you chose? I am trying to sort out which lists should be used and what a happy medium is.

Edit - after taking a closer look, it appears the individual sources you are using are very similar to firehol_level1. With a goal of having no false positives, this is a great place to start. I guess whether you grab them individually or through FireHOL is personal preference.

What a great script - thank you very much.

malc0de

$update url=http://malc0de.com/bl/IP_Blacklist.txt description="Malc0de" delimiter=("\n")

It would be nice if it were possible to apply a filter so that only the needed data ends up in the variable. Then there would be a lot more room in the variable.

This does not work :)
“data” is an element of the array, and is accepted for processing only in its entirety - you cannot process only part of the element.

@Shumkov what was your goal/strategy for the lists you chose? I am trying to sort out which lists should be used and what a happy medium is.

Edit - after taking a closer look, it appears the individual sources you are using are very similar to firehol_level1.

That’s right, I took FireHOL Level1 as the basis.
I removed “Feodo Tracker” and “Ransomware Tracker”, replaced “Bambenek C2” with “Bambenek High-Confidence C2” (as Bambenek himself recommends), and also removed “Fullbogons” - I get those via BGP.

I agree, and my angle is to filter the traffic (stream) on its way into the data array.

Like this, in shell scripting:

wget -q -O - $url | gawk --posix --field-separator=, '/^[0-9]{1,3}\.[0-9]{1,3}\.[0-9]{1,3}\.[0-9]{1,3}/ { print "$i a=" $1;}'  > $saveTo/$filename

This is something only MikroTik could implement - intercepting the stream.

Makes perfect sense. Thank you again so much for this.

Is there a way to check the file size and have it trigger the email tool if it gets beyond the max file size?

Is there a way to check the file size and have it trigger the email tool if it gets beyond the max file size?

You can try this:

:if (([/tool fetch url=<url> output=user as-value]->"total")>63) do={/tool e-mail send ...}
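
A fuller sketch along those lines (the recipient address is a placeholder, /tool e-mail must already be configured, and the comparison with 63 assumes “total” is reported in KiB, as in the line above):

:local result [/tool fetch url=https://www.dshield.org/block.txt output=user as-value]
:if (($result->"total")>63) do={/tool e-mail send to="admin@example.com" subject="Blocklist exceeds 63 KiB"}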

Thank you for that.

Do you have a dedicated link for the fullbogons piece? I cannot seem to find a direct URL for it.