I’ve been asked to allow users to access only a small subset of domains and IP addresses on the internet (HTTP and HTTPS). For example, say: *.acme.com, *.acme2.com, and the IP 12.34.56.78. All told, there are only about 12 domains and IPs that we want to allow.
I’ve looked at several ways of doing this, but can’t seem to make it work reliably. Obviously, the IP addresses are simple with normal filtering rules, but I’m having real problems with the domain names. I’ve tried layer7 rules and must be doing something wrong. I’ve also tried the web proxy, but couldn’t figure out how to make it proxy/block everything I don’t want to allow through.
Can someone provide me an example or steer me in the right direction?
In such cases it is usually best to turn down the request. It is very hard to implement and very easy for creative users to work around… but if you really want to pursue this and your filter fails to catch HTTPS
traffic, you can experiment further with the new “TLS host” matcher in the firewall filter.
Well, I have to admit that that wasn’t the answer I was hoping for, but thank you for the response. I’ll report back to that effect (and play with the new tls-host matching stuff on my own time).
Please note that such requests, which are usually posted in an obfuscated way like yours (*.acme.com), in reality
refer to some “website”. E.g. “we only want to allow CNN.COM”.
However, a website normally uses many more domain names than the one you type in the URL bar, including many names
that are outside the domain itself (in this case CNN.COM).
To get all but the most rudimentary websites fully functioning, you need quite a long list of allowed domains, and
worse: the list changes at any random moment, whenever the site manager decides to include some new gadget.
So you will have the continuing task of monitoring the functionality of the allowed sites and adapting your filters.
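If you do end up maintaining such a list, one way to keep it in one place is RouterOS’s support for DNS names in address lists (available since 6.36): the router resolves the name itself and refreshes the entry when the TTL expires. A sketch, with hypothetical host names standing in for the real list:

```
/ip firewall address-list
# The router resolves these names and keeps the IPs refreshed
add list=allowed address=www.acme.com
add list=allowed address=cdn.acme.com
add list=allowed address=12.34.56.78

/ip firewall filter
# Allow HTTP/HTTPS only to resolved addresses on the list, drop the rest
add chain=forward protocol=tcp dst-port=80,443 dst-address-list=allowed action=accept
add chain=forward protocol=tcp dst-port=80,443 action=drop
```

Two caveats: address-list entries are exact names (no *.acme.com wildcards), so every subdomain must be listed individually, and with CDN-hosted sites the client may resolve a name to a different IP than the router did, which breaks the match. Both caveats reinforce the maintenance burden described above.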