Now I understand why I’m not a programmer … AT ALL…I can’t wrap my head around such simple logic.
Please follow my reasoning here and feel free to correct me; I'll post inline comments starting with @@@.
{
/ip firewall address-list
:local update do={
:do {
:local result [/tool fetch url=$url as-value output=user]; :if ($result->"downloaded" != "63") do={ :local data ($result->"data")
@@@ this line fetches the bottom defined URL and if value of the “downloaded” is not 63 it maps each line received from the $result variable into another local variable “data” @@@
:do { remove [find list=$blacklist comment!="Optional"] } on-error={}
@@@ find an existing ACL and remove entries which do not have a comment value set to "Optional" @@@
:while ([:len $data]!=0) do={
@@@ So as long as $data is not empty (as it contains the freshly loaded info from the URL), perform the statements below @@@
:if ([:pick $data ([:find $data "address=" -1] + 8) [:find $data " list=" -1]]~"((25[0-5]|(2[0-4]|[01]\?[0-9]\?)[0-9])\\.){3}(25[0-5]|(2[0-4]|[01]\?[0-9]\?)[0-9])\\/(3[0-2]|[0-2]\?[0-9])") do={
@@@ So scan each line looking for what is between the head & tail of each message. @@@
@@@ Example below of the fetch @@@
/log info "Loading GR ipv4 address list"
/ip firewall address-list remove [/ip firewall address-list find list=GR]
/ip firewall address-list
:do { add address=2.84.0.0/14 list=GR } on-error={}
:do { add address=5.54.0.0/15 list=GR } on-error={}
:do { add address=5.144.192.0/18 list=GR } on-error={}
:do { add address=5.172.192.0/20 list=GR } on-error={}
:do {add list=$blacklist address=([:pick $data 0 [:find $data $delimiter]])} on-error={}
}
@@@ populate the $blacklist (called here "GRTLD"), and this is what I don't understand => does the $data above only contain the data between address=X.X.X.X/X and list=GR ???
@@@ Why this construction address=([:pick $data 0 [:find $data $delimiter]]) to populate the actual IP/MASK in the ACL ? Probably the $delimiter is of no use here anymore
:set data [:pick $data ([:find $data "\n"]+1) [:len $data]]
@@@ Why the above rule ? Why do you have to “set” data ? You just want to parse (as long as $data!=0 ? right?)
} ; :log warning "Imported address list <$blacklist> from file: $url"
} else={:log warning "Address list: <$blacklist>, downloaded file too big: $url" }
} on-error={:log warning "Address list <$blacklist> update failed"}
}
@@@ above some generic messages depending on exit/error-codes I guess, not really mandatory anyway @@@
Do not think badly of yourself; in other posts I read about a user who called himself an expert programmer and could not solve a minimal logic problem…
Now I read what you wrote, and I reply.
If you can, next time put the script inside a code block, or it is hard to read (for me), thanks.
I indented everything, because without indentation it is nearly unreadable.
I only explain, I do not correct anything.
WARNING for other users: do not use this, it is useless.
# open the bracket { to test inside a terminal, remove on script
{
# put the context in the right... context (bad: it must be defined inside the function, but for now it does not matter)
/ip firewall address-list
# define update function
:local update do={
# a bad start: relying immediately on "on-error"...
:do {
:local result [/tool fetch url=$url as-value output=user]
# TRUE: @@@ this line fetches the bottom defined URL and if value of the "downloaded" is not 63 it maps each line
# received from the $result variable into another local variable "data" @@@
:if ($result->"downloaded" != "63") do={
:local data ($result->"data")
# another on-error; the remove in this case can't produce any error anyway...
:do { remove [find list=$blacklist comment!="Optional"] } on-error={}
# TRUE: @@@ find an existing ACL and remove entries which do not have a comment value set to "Optional" @@@
:while ([:len $data]!=0) do={
# TRUE: @@@ So as long as the $data is not empty (as it contained the freshly loaded info from the URL perform statements below @@@
:if ([:pick $data ([:find $data "address=" -1] + 8) [:find $data " list=" -1]]~"((25[0-5]|(2[0-4]|[01]\?[0-9]\?)[0-9])\\.){3}(25[0-5]|(2[0-4]|[01]\?[0-9]\?)[0-9])\\/(3[0-2]|[0-2]\?[0-9])") do={
# TRUE: but it is better to split this into two passes, it is more readable @@@ So scan each line looking for what is
# between the head & tail of each message. @@@
# another on-error
:do { add list=$blacklist address=([:pick $data 0 [:find $data $delimiter]])} on-error={}
# TRUE: @@@ populate the $blacklist (called here "GRTLD") @@@
# end of if ([:pick $data ...
}
# @@@ and this is what I don't understand => does the $data above only contain the data between address=X.X.X.X/X and list=GR
# You do not understand it because, every time a record is saved, the remaining data replaces "data";
# this is why, in earlier scripts, a ^ is present at the front of the regex
# I use another method: instead of losing time modifying the data every time,
# I move the current "pointer" to the start and the end of the place where the regex tries to match...
# @@@ Why this construction address=([:pick $data 0 [:find $data $delimiter]]) to populate the actual IP/MASK in the ACL \?
# Probably the $delimiter is of no use here anymore
# I did not write these functions, but it is all a mess...
:set data [:pick $data ([:find $data "\n"]+1) [:len $data]]
# @@@ Why the above rule - Why do you have to "set" data - You just want to parse (as long as $data!=0 right)
# parsing does not consume the data; I explained the cut-and-paste of data lines above
# end of while
}
:log warning "Imported address list <$blacklist> from file: $url"
# end of :if ($result->"downloaded" != "63")
} else={:log warning "Address list: <$blacklist>, downloaded file too big: $url" }
# end of general function update
} on-error={ :log warning "Address list <$blacklist> update failed" }
# TRUE: @@@ above some generic messages depending on exit/error-codes I guess, not really mandatory anyway @@@
# end of update function
}
# launch the function update with parameters
# better to use " " every time the value is not clearly a number, true or false, yes or no, an IP or an IP-prefix, and surely other cases I am missing right now...
$update url=https://www.iwik.org/ipcountry/mikrotik/GR blacklist="GRTLD" delimiter=("\n")
# close the script for the terminal
}
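For readers who, like the OP, find the consume-and-cut loop hard to follow: the idea is to take the first line of $data, try to import it, then re-set $data to everything after the first \n. A minimal Python sketch of the same loop, with the address=/list= extraction simplified to a plain regex search (the function name and simplification are mine, not part of the script):

```python
import re

# Rough equivalent of the script's IPv4-prefix pattern (e.g. 2.84.0.0/14)
PREFIX_RE = re.compile(
    r"((25[0-5]|(2[0-4]|[01]?[0-9]?)[0-9])\.){3}"
    r"(25[0-5]|(2[0-4]|[01]?[0-9]?)[0-9])/(3[0-2]|[0-2]?[0-9])"
)

def import_lines(data):
    """Mimic the RouterOS loop: cut one line off the front of $data
    per iteration until nothing is left."""
    imported = []
    while len(data) != 0:
        cut = data.find("\n")                   # [:find $data "\n"]
        line = data if cut == -1 else data[:cut]
        m = PREFIX_RE.search(line)              # the address=/list= extraction
        if m:
            imported.append(m.group(0))         # would become the add command
        # :set data [:pick $data ([:find $data "\n"]+1) [:len $data]]
        data = "" if cut == -1 else data[cut + 1:]
    return imported
```

Each pass shortens `data`, which is exactly why the script has to `:set` it: the `:while` only ends once the cut-down copy is empty.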
I’ll give you another piece of code… it’s almost finished, actually there is no need for any “on-errors”…
What is missing:
1) Read the file from the server and handle a missing file, wrong responses from the server, etc. (the hard part)
2) Create a function and pass the parameters to it
3) Check whether the passed parameters are correct
4) Create a whitelist; before adding the IP / IP-prefix, check if it is on the whitelist, and if it is, do not add it
5) On add, check if the IP-prefix is already contained inside another IP-prefix already on the address-list
6) On add, check if the IP-prefix encompasses one or more IP-prefixes on the address-list; remove the old one(s) and add the new, bigger one
7) For security, accept only /12 to /32 prefixes; /11 or less on IPv4 is too big to be real…
8) Add an option to put the IP on the address-list in a temporary way (Dynamic) for a specified time (from 1 second to nearly 35 weeks);
this type of entry is not included when the address-list is exported or backed up.
With this option set, if the address is found again on the imported list, instead of deleting and re-importing it, its time is reset again (from 1 second to nearly 35 weeks)
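The containment checks described above (skip a prefix an existing entry already covers; replace entries a new, bigger prefix swallows) can be sketched with Python's ipaddress module. This is only an illustration of the logic, not the finished RouterOS implementation; `add_prefix` is a hypothetical name:

```python
import ipaddress

def add_prefix(addr_list, new):
    """Add `new` to a set of prefix strings unless an existing entry
    already covers it; drop existing entries the new prefix covers."""
    net = ipaddress.ip_network(new)
    # the post's sanity range: only /12 .. /32 is plausible for a country list
    if not 12 <= net.prefixlen <= 32:
        raise ValueError("prefix length out of the /12../32 range")
    current = {ipaddress.ip_network(a) for a in addr_list}
    # an existing, equal-or-bigger prefix already contains the new one: skip
    if any(net.subnet_of(old) for old in current):
        return set(addr_list)
    # the new, bigger prefix swallows some existing entries: remove them
    kept = {str(old) for old in current if not old.subnet_of(net)}
    kept.add(str(net))
    return kept
```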
For now the script is intended to work in a terminal only; after completion, it needs to be converted for use in scripts.
Some items are modified to generate invalid values like 5.203.0.0/00 and 31.14.168.0/0 (formerly valid IPs with prefix, but /0 means "all IPs"…)
# simulation of reading from a file
:global teststr (":do { add address=2.84.0.0/14 list=GR } on-error={}\r\n:do { add address=5.54.0.0/15 list=GR } on-error={}\r\
\n:do { add address=5.144.192.0/18 list=GR } on-error={}\r\n:do { add address=5.172.192.0/32 list=GR } on-error={}\r\
\n:do { add address=5.203.0.0/00 list=GR } on-error={}\r\n:do { add address=31.14.168.0/0 list=GR } on-error={}")
# manually defined, but in the future read as parameters of the function
:global addlist "test"
# add a parameter for whether the entry must be dynamic (only in volatile memory, self-destructing after x seconds/hours/days/etc.)
# or static (kept across reboots)
# an option must also be added to accept from the downloaded address list only IPs, only IP-prefixes, or both; for now it accepts only IP-prefixes
# keep previous entries in the address-list or not
:global keep true
:global head "address="
:global tail " list="
# initializing variables (global because we want to test it in a terminal; in a script they can/must be local)
:global regexipwithsubnet "((25[0-5]|(2[0-4]|[01]\?[0-9]\?)[0-9])\\.){3}(25[0-5]|(2[0-4]|[01]\?[0-9]\?)[0-9])\\/(3[0-2]|[0-2]\?[0-9])"
:global lenght [:len $teststr]
:global offset [:len $head]
:global actualhead -1
:global actualtail -1
:global testip ""
# move all to the right context to shorten the commands
/ip firewall address-list
# if the previous content is not to be kept, it removes all entries
:if (!($keep)) do={ remove [find where list=$addlist] }
:while ([:typeof $actualtail] != "nil") do={
:set actualhead ([:find $teststr $head $actualtail] + $offset)
:set actualtail [:find $teststr $tail $actualhead]
:if ([:typeof $actualtail] != "nil") do={
:set testip [:pick $teststr $actualhead $actualtail]
# if a list of IPs without prefixes must be imported, simply check
# :if ([:typeof [:toip $testip]] = "ip") do={
# because of a bug added in newer versions,
# you cannot directly test whether a string is an ip-prefix, and ip-prefix does not have a function like :toip
# I invented this workaround to avoid regex, but it is hard to understand and I don't know if it will stop working in future versions
# :if ([:typeof [[:parse ":return $testip"]] ] = "ip-prefix") do={
:if ($testip ~ $regexipwithsubnet) do={
:if ($testip ~ "\\/0(0|\$)") do={
:log warning "Invalid IP-prefix >$testip<"
} else={
# the address list saves IP/32 without the /32, so search for duplicates without the /32; adding with or without the /32 does not matter
:if ($testip ~ "\\/32") do={ :set testip [:pick $testip 0 [:find $testip "/32" -1]] }
:if ([:len [find where list=$addlist and address=$testip]] = 0) do={ add list=$addlist address=$testip }
}
}
}
}
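For quick off-router testing, the same validation steps (regex match, /0 and /00 rejection, /32 stripping) can be mirrored in Python. This is only a sketch of the checks; the pattern is anchored here for strictness, and `normalize_prefix` is an illustrative name:

```python
import re

# Python version of the script's $regexipwithsubnet, anchored at both ends
PREFIX_RE = re.compile(
    r"^((25[0-5]|(2[0-4]|[01]?[0-9]?)[0-9])\.){3}"
    r"(25[0-5]|(2[0-4]|[01]?[0-9]?)[0-9])/(3[0-2]|[0-2]?[0-9])$"
)

def normalize_prefix(testip):
    """Return the value the script would add, or None if invalid."""
    if not PREFIX_RE.match(testip):
        return None
    if re.search(r"/0(0|$)", testip):   # reject /0 and /00 ("all IPs")
        return None
    if testip.endswith("/32"):          # RouterOS stores IP/32 without /32
        testip = testip[:-3]
    return testip
```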
I’m probably going to look for a slightly modified way outside of RouterOS, as this RouterOS scripting gives me permanent brain-freeze.
I have some 24/7 NAS running anyway that can handle something simple like:
curl -v --stderr - https://www.iwik.org/ipcountry/mikrotik/GR | grep -E -o '((25[0-5]|(2[0-4]|[01]?[0-9]?)[0-9])\.){3}(25[0-5]|(2[0-4]|[01]?[0-9]?)[0-9])/(3[0-2]|[0-2]?[0-9])'
(I can easily have an iteration loop fetching more TLD’s where required)
This gives me a curated output with what is needed, plain IPv4 prefixes.
Then it’s a matter of “fetching” this curated list on the Mikrotik using the most simple “importer” as there is not that much voodoo to be done.
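If the curation ends up on the NAS anyway, the grep pipeline above has a rough Python equivalent. The URL is the one from the post; the looser pattern here is a deliberate simplification, and the download step is shown only as a comment:

```python
import re

# loose extraction pattern; the strict grep regex from the post would work too
PREFIX_RE = re.compile(r"\d{1,3}(?:\.\d{1,3}){3}/\d{1,2}")

def curate(text):
    """grep -E -o equivalent: keep only IPv4 prefixes, one per line."""
    return "\n".join(PREFIX_RE.findall(text)) + "\n"

# On the NAS this would be fed from the download, e.g. (URL from the post):
#   from urllib.request import urlopen
#   with urlopen("https://www.iwik.org/ipcountry/mikrotik/GR") as r:
#       open("GR.txt", "w").write(curate(r.read().decode()))
```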
Stupid me, apparently on the same host the "regular" CIDR prefixes for IPv4/IPv6 can be found.
The specific URL used by me points to a "mikrotik" section with indeed RouterOS import-"ready" entries.
So yeah, we only need to fetch the list below (or any other TLD-code). This is then “safe” (well, as safe as it can be…) to process with the existing scripts as you are not executing any CLI directly.
By opening this, you can also see if the size is too large for RouterOS (65535 bytes / 64K) https://www.iwik.org/ipcountry/
For example, Italy is 47K, which is OK, but the USA is 942K, which is excessive and cannot be imported without first dividing the file into 15 parts.
I opened the file in a hex editor; each IP-prefix is separated with
0x0A / char 10 / \n / New Line
and the MS-DOS style
0x0D / char 13 / \r / Carriage Return
is not present.
The file ends with one \n.
At this point, split the file on each \n and test whether the field is a valid IP-prefix before importing.
An extra \n must be added at the start and one \n at the end so that the first and the last lines are also read,
because before the first line no \n is present, and the \n at the end is not guaranteed.
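The pad-with-\n idea (sentinels at both ends so the first and last fields need no special-casing) can be sketched in Python; `extract_fields` is an illustrative name, and this mirrors the head/tail pointer walk rather than using a plain split:

```python
def extract_fields(raw):
    """Pad with \n at both ends so every field sits between two \n
    markers, then walk the \n positions, skipping empty fields."""
    data = "\n" + raw + "\n"
    out = []
    start = 0                      # position of the current leading \n
    while True:
        end = data.find("\n", start + 1)
        if end == -1:
            break
        field = data[start + 1:end]
        if field:                  # skip empty lines
            out.append(field)
        start = end
    return out
```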
The terminal script I wrote before is still valid;
just specify \n as both head and tail (I removed all previous comments for better reading; to see them, go to the previous post)
## Generic IP address list input
## Based on a script written by Sam Norris, ChangeIP.com 2008
## Edited by Andrew Cox, AccessPlus.com.au 2008
:if ( [/file get [/file find name=sidekiqIPs.txt] size] > 0 ) do={
# Remove existing addresses from the current address list
/ip firewall address-list remove [/ip firewall address-list find list=Plex]
:global content [/file get [/file find name=sidekiqIPs.txt] contents] ;
:global contentLen [ :len $content ] ;
:global lineEnd 0;
:global line "";
:global lastEnd 0;
:do {
:set lineEnd [:find $content "\n" $lastEnd ] ;
:set line [:pick $content $lastEnd $lineEnd] ;
:set lastEnd ( $lineEnd + 1 ) ;
#If the line doesn't start with a hash then process and add to the list
:if ( [:pick $line 0 1] != "#" ) do={
:local entry [:pick $line 0 $lineEnd ]
:if ( [:len $entry ] > 0 ) do={
/ip firewall address-list add list=Plex address=$entry
}
}
} while ($lineEnd < $contentLen)
}
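What the loop above intends (one entry per line, skipping comment lines and empty lines) can be stated compactly in Python; this mirrors the intent only, not the RouterOS :find/:pick mechanics, and the function name is mine:

```python
def parse_entries(content):
    """The loop's intent: one entry per line, skipping comment lines
    (starting with '#') and empty lines."""
    entries = []
    for line in content.split("\n"):
        line = line.strip()
        if line and not line.startswith("#"):
            entries.append(line)
    return entries
```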
My issue is the following:
URL has the following (example) IPs
54.170.120.91
46.51.207.89
(empty line)
Script generates this:
Flags: X - disabled, D - dynamic
# LIST ADDRESS CREATION-TIME TIMEOUT
0 Plex 54.170.120.91 aug/11/2021 18:28:58
1 Plex 46.51.207.89 aug/11/2021 18:28:58
2 Plex 4.170.120.91 aug/11/2021 18:28:58
So the first line is duplicated as the last line, with its first character truncated.
I am not so good at scripting and haven't managed to figure out why this happens.
I also tried a lot of variants of this script, without success.
Found an error when you try to delete a large number of imported IPs.
/ip dns static remove [find address=127.0.0.1]
action timed out - try again, if error continues contact MikroTik support and send a supout file (13)
It takes some minutes to delete a big list, so I guess the time limit for a command to run is exceeded, which produces the error.
Running the command again removes the rest of the list. MT should increase the timeout here so this message does not appear.
Nah, not really abandoned. Just letting things settle a bit....
For the previous poster on Plex: I do exactly the same using this script (which I recycle and use as a generic list-loader).
The only thing I've added is that the script only deletes entries not having the comment "Static".
My Plex ACL contains the AWS IPs, but also entries that I've added manually (guests with fixed IPs).
As you can see from "dynamic=no", I also do not touch/delete any dynamic entries that are created with, for example, "port-knock" sequences for temporary access.
So this works just fine.
When I have time, I will finish the script.
That script can import any type of IP list and helps avoid inserting duplicates or doubly-matched subnets, also shrinking the address list, like:
a new 10.0.0.0/24 replaces both an already existing 10.0.0.0/25 and 10.0.0.0/28, because they are included in the /24;
and, for example, 10.0.0.0/25 and 10.0.0.0/28 are not added if 10.0.0.0/24 is already present on the address-list, because the /24 includes the /25 and the /28.
And what about splitting to work around the 64K limit? E.g. if you look at the CN file (and US is probably even bigger), it downloads a 530 KByte file with plenty of lines in it.
On Linux a very simple "split -l 3000" would create a bunch of cna, cnb, cnc, ... files, and you can choose the size so each part remains within the 64K limit of RouterOS.
Is that even possible in RouterOS? I think many people have looked at this in the past, but searching through the forum I never found a way to do it.
I don't think it's possible, because you always have to start loading the file from the beginning; hence you'll run out after the first 64K and cannot look further?
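A size-aware variant of "split -l 3000" is easy to sketch in Python: group lines into chunks that stay under the 64K limit by bytes rather than by line count. The function name and chunking policy are illustrative only:

```python
def split_under_limit(lines, limit=65535):
    """Group lines into chunks whose total size (each line plus its
    newline) stays under `limit` bytes; a size-aware `split -l`."""
    chunks, current, size = [], [], 0
    for line in lines:
        n = len(line.encode()) + 1            # +1 for the newline
        if current and size + n > limit:
            chunks.append("\n".join(current) + "\n")
            current, size = [], 0
        current.append(line)
        size += n
    if current:
        chunks.append("\n".join(current) + "\n")
    return chunks
```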
Removing one address in a big address-list is very inefficient. If you make changes to a current address-list, create a 'delta' with your changes.
Importing a new list then means removing the whole list, importing the delta, and then the new list. The delta entries are not overwritten when you use on-error.
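The delta idea (touch only the differences between the router's current list and the freshly downloaded one) can be sketched with plain set differences; `delta` is an illustrative helper, assuming both lists are available as sets of strings:

```python
def delta(old, new):
    """Return (to_remove, to_add): only the differences between the
    router's current list and the freshly downloaded one."""
    return old - new, new - old
```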
Or, if you want to exclude a number of countries, you could turn that around and just define which countries you want to accept, and exclude the rest.
None of the links works any longer; everything is abandoned because no one pays for the service…
But you don't check what you recommend, do you? Do you just post random links?
I share all the code; @IntrusDave, even though his service has stopped, does not want to share his.
I do it for free; @IntrusDave saw it only as an opportunity to make money …
Sorry, but I feel no need to disclose my stats and financial needs for a service that is free.
I can tell you that 4 servers, 120 honeypots, a CDN, storage, and the bandwidth needed for all of it add up to quite a lot.
I won't be open-sourcing the code either. It's 100% written by me with no use of any open-source code.
It still has a use to me, and I will be keeping it for myself.
My script does not remove previous entries (unless specified), checks for duplicates, and does more without using on-error at any point.
I also developed a way to shrink an address list by removing overlapping entries; if you read my previous posts, you will see the scripts and the descriptions.