Any idea why the script does not like
$update url=http://www.spamhaus.org/drop/drop.txt
Tried delimiter " ;" and ";" and variations including "\n".
When I open that link I get: Could not connect: (and no further output).
So I can understand why the script does not like such a response.
jvanham@cruncher:~$ wget http://www.spamhaus.org/drop/drop.txt
--2022-06-04 07:43:28-- http://www.spamhaus.org/drop/drop.txt
Resolving www.spamhaus.org (www.spamhaus.org)… 104.16.198.238, 104.16.199.238
Connecting to www.spamhaus.org (www.spamhaus.org)|104.16.198.238|:80… connected.
HTTP request sent, awaiting response… 200 OK
Length: 19 [text/html]
Saving to: ‘drop.txt’
drop.txt 100%[==========================================================>] 19 --.-KB/s in 0s
2022-06-04 07:43:28 (2,01 MB/s) - ‘drop.txt’ saved [19/19]
jvanham@cruncher:~$ more drop.txt
Could not connect:
jvanham@Cruncher:~$
I see that total: is zero and then nothing is imported by the script. The download shows 25KiB, so something must be going wrong. I am currently on RouterOS 7.2RC6.
status: connecting
status: finished
downloaded: 25KiB
total: 0KiB
duration: 1s
The reason there is a "Could not connect:" on a direct download is that the URL redirects to HTTPS. Handling this will be possible in a next version of this script, as you can read above.
When in doubt, test the URL in a browser, look at the URL the browser ends up at, and check whether the content is being shown.
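You can also test it from the RouterOS terminal. A minimal sketch (assuming RouterOS v7 and /tool fetch with output=user): fetch the HTTPS variant directly and print the first bytes of what comes back:
# sketch: fetch the HTTPS URL directly and inspect the response
:local r [/tool fetch url="https://www.spamhaus.org/drop/drop.txt" output=user as-value]
:put ($r->"status")
:put [:pick ($r->"data") 0 80]
If real list data shows up here instead of "Could not connect:", the problem is the http-to-https redirect and not the list itself.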
First success with edited script:
{... $update url=https://www.spamhaus.org/drop/drop.txt delimiter=" ;" listname=spamhaus timeout=1d nolog=1
Starting import of address-list: spamhaus
Conditional deleting all entries in address-list: spamhaus
Using config-line defined delimiter: " ;"
Reading Part: 1 0 - 63999
Completed reading 931 items into address-list spamhaus.
Using " ;" as delimiter, space and then a ;
Posted the updated script here: http://forum.mikrotik.com/t/address-lists-downloader-dshield-spamhaus-drop-edrop-etc/133640/142
You will have to add a space before the semicolon and enclose it with " … and don't forget to change http to https here.
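For example, the line that worked above:
$update url=https://www.spamhaus.org/drop/drop.txt delimiter=" ;" listname=spamhaus timeout=1d nolog=1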
Updated script: up to three spaces are now detected before the delimiter. Also introduced a new variable named remarksign. This is needed for lists that use the same character both as delimiter and as the start of a comment line.
To find the correct position, the lines which start with the remark/delimiter character have to be removed from the data before the correct one can be found.
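Roughly like this (only a sketch with made-up variable names, not the actual script code):
# skip leading lines that start with the remark sign before locating the delimiter
# (assumes every remark line ends with \n)
:local data ($fetchResult->"data")
:local remarksign ";"
:while ([:pick $data 0 1] = $remarksign) do={
  :set data [:pick $data ([:find $data "\n"] + 1) [:len $data]]
}
# the first " ;" found now belongs to a real data line
:put [:find $data " ;"]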
Another update: cleaned and optimized the script. It now also avoids downloading the same file twice.
That did the trick. Thanks for the adjustment, @msatter!!
Apologies for asking again; I'm trying to integrate your suggestions for handling redirects, without success.
In the terminal, with :put ([$checkurl "https://snort.org/downloads/ip-block-list"]) I was able to read the correct URL. Any hint on how to easily integrate it with $update url= ?
Probably not useful, but testing :put ([/tool fetch url=[$checkurl "https://snort.org/downloads/ip-block-list"] output=user as-value]->"data") I got all the data in the terminal.
Thanks
You missed adding :global checkurl. This is because you have to declare a global variable before you can use it in a script.
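In other words, the calling script needs something like this at the top before the function is referenced:
:global checkurl
# after that the stored function can be called, e.g.
:put [$checkurl "https://snort.org/downloads/ip-block-list"]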
After putting the :global checkurl code at the top of the main script, I use this to download everything:
:global testurl [$checkurl url=https://snort.org/downloads/ip-block-list]; $update url=[:pick $testurl 1] listname="Snort"
Seems to work just fine.
Tried to download greensnow
$checkurl url=https://blocklist.greensnow.co/greensnow.txt
cod=666.1;txt=invalid URL protocol
It's not a redirect, and it also fails the basic download via $update:
failure: closing connection: 85.236.154.77:443 (4)
It loads in a browser OK.
Any ideas?
url= ??? is not one of checkurl's parameters
Can't help, still fighting with [$checkurl url=https://… but I also had the same error with Greensnow. You can find my updated list at http://forum.mikrotik.com/t/address-lists-downloader-dshield-spamhaus-drop-edrop-etc/133640/148
url= ??? is not one of checkurl's parameters
My posting states:
$update url=[$checkurl https://view.sentinel.turris.cz/greylist-data/greylist-latest.csv] delimiter=, listname=turris timeout=8d heirule=http nolog=1
So it is $update url=[$checkurl htt… and not $update [$checkurl url=htt…
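Side by side, with the turris line as example:
$update url=[$checkurl https://view.sentinel.turris.cz/greylist-data/greylist-latest.csv] delimiter=, listname=turris timeout=8d heirule=http nolog=1
# and not: $update [$checkurl url=https://…]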
Update: first test with using the function checkurl:
{... $update url=https://snort.org/downloads/ip-block-list listname=snort timeout=1h
{... $update url=https://project.turris.cz/greylist-data/greylist-latest.csv listname=turris delimiter=, heirule=http timeout=8d
{... $update url=https://view.sentinel.turris.cz/greylist-data/greylist-latest.csv listname=turris delimiter=, heirule=http timeout=8d
{... $update url=htt ps://lists.blocklist.de/lists/all.txt listname=blockDE timeout=1h nolog=1
{... $update url=http://www.spamhaus.org/drop/drop.txt listname=spamhaus delimiter=";" timeout=1h nolog=1
{... }
There was a problem downloading snort and the list has been ignored!
There was a problem downloading turris and the list has been ignored!
Starting import of address-list: turris
Conditional deleting all entries in address-list: turris
List identified as a IPv4 list
Using delimiter: ","
Reading Part: 1 0 - 63999
Reading Part: 2 63488 - 127487
Reading Part: 3 126976 - 190975
Completed reading 5659 items into address-list turris.
There was a problem downloading blockDE and the list has been ignored!
Starting import of address-list: spamhaus
Conditional deleting all entries in address-list: spamhaus
List identified as a IPv4 with ranges list
Using delimiter: " ;"
Reading Part: 1 0 - 63999
Completed reading 931 items into address-list spamhaus.
Same config but with checkurl function active
Starting import of address-list: snort
Conditional deleting all entries in address-list: snort
List identified as a IPv4 list
Using delimiter: "New Line"
Reading Part: 1 0 - 63999
Completed reading 783 items into address-list snort.
Checking URL...Problem (code): 301 - https://view.sentinel.turris.cz/greylist-data/
address-list turris is not imported. Check log more information
Starting import of address-list: turris
Conditional deleting all entries in address-list: turris
List identified as a IPv4 list
Using delimiter: ","
Reading Part: 1 0 - 63999
Reading Part: 2 63488 - 127487
Reading Part: 3 126976 - 190975
Completed reading 5659 items into address-list turris.
Checking URL...Problem (code): 666.1 - invalid URL protocol
address-list blockDE is not imported. Check log more information
Starting import of address-list: spamhaus
Conditional deleting all entries in address-list: spamhaus
List identified as a IPv4 with ranges list
Using delimiter: " ;"
Reading Part: 1 0 - 63999
Completed reading 931 items into address-list spamhaus.
Here then is a testing version of the list downloader with support for the :global $checkurl function. I put the function underneath the script; it has to be executed once so it gets stored as a :global. The script also runs without $checkurl. I have adapted the function so that it does not use the file system when there is a straight download (code 200):
{removed
}
checkurl with no use of the file system when it is a straight download:
{removed
}
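For reference only, this is not the removed function but a minimal stand-in that mimics the calling contract used in this thread: stored as a :global, it takes the URL as its unnamed argument and returns an array whose second element (index 1) is the URL to download, hence the [:pick $testurl 1] pattern earlier.
# stand-in sketch only; the real function resolves redirects and reports a result code
:global checkurl do={
  :local requested $1
  # a real implementation fetches $requested, detects 301/302 and returns the
  # redirect target plus a result code; this stub just echoes the URL back
  :local out {200; $requested}
  :return $out
}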
I also looked at Greensnow but could not find a cause for why RouterOS has trouble with it. As soon as I use output=user, it gives an error.
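A minimal way to isolate it (test snippet only):
:do {
  :local r [/tool fetch url="https://blocklist.greensnow.co/greensnow.txt" output=user as-value]
  :put ($r->"status")
} on-error={ :put "fetch with output=user failed" }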
In addition to importing IP lists, I’m looking for a way to import a domain list, like
https://threatview.io/Downloads/DOMAIN-High-Confidence-Feed.txt
I’ve had pretty good luck using the domain in this type of filtering scheme
In this case, I block 0000.com.my
/ip firewall raw
remove [find comment="Malicious_domain"]
add action=drop chain=prerouting protocol=tcp dst-port=80,443 content=0000.com.my tls-host=*0000.com.my place-before=0 comment="Malicious_domain"
Has anyone seen a script to do this sort of import?
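Roughly, I imagine something like the loop below (hypothetical and untested; it assumes one domain per line with LF line endings and a feed small enough to fit in the data that output=user returns):
:local result [/tool fetch url="https://threatview.io/Downloads/DOMAIN-High-Confidence-Feed.txt" output=user as-value]
:local data ($result->"data")
:while ([:len $data] > 0) do={
  :local lineEnd [:find $data "\n"]
  :if ([:typeof $lineEnd] = "nil") do={ :set lineEnd [:len $data] }
  :local domain [:pick $data 0 $lineEnd]
  # skip empty and comment lines, then add one rule per domain, mirroring the manual rule above
  :if ([:len $domain] > 0 && [:pick $domain 0 1] != "#") do={
    /ip firewall raw add action=drop chain=prerouting protocol=tcp dst-port=80,443 content=$domain tls-host=("*" . $domain) comment="Malicious_domain"
  }
  :if ($lineEnd >= [:len $data]) do={ :set data "" } else={ :set data [:pick $data ($lineEnd + 1) [:len $data]] }
}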
Hello @msatter, still trying to understand your and @rextended's suggestions for $checkurl (already tried several times without success, I feel so dumb…), may I ask why you removed
http://forum.mikrotik.com/t/address-lists-downloader-dshield-spamhaus-drop-edrop-etc/133640/1
content?
Thank you
@Simonej: I explained the misunderstanding here: http://forum.mikrotik.com/t/address-lists-downloader-dshield-spamhaus-drop-edrop-etc/133640/175 You need to check the sequencing of the items in that line.
As for the removed content being shown: the cause is what is currently going on in the forum, the reaction from MikroTik and/or the moderators, and MikroTik's communication to their users. I am having a personal cool-down period and thinking about whether this is still a healthy situation.
A huge THANKS from all the readers; your contribution is precious.
Wish you all the best.
PS: I always used $update url=[$checkurl as indicated, not the other way.
Greensnow is fully included in FireHOL level2 anyway:
$update url=https://iplists.firehol.org/files/firehol_level2.netset listname=firehol_level2 delimiter=("\n") timeout=90d
So which version of the script here is most effective and doesn’t impact disk read/write? Also, what about IPv6?
Does anyone know why this list fails?
$update url=https://feodotracker.abuse.ch/downloads/ipblocklist_recommended.txt listname=FeodoC2 delimiter=("\n") timeout=90d