I am looking for a way for the router to connect to a website and automatically cache all of its content down to four or five levels of subfolders (a tree), as if the router itself were browsing the site and caching the pages.
http://www.exampleweb.com loads index.html, and within index.html there are links to product1.html, product2.html, and so on. The idea is that the router would follow those links and download them too.
The website I am talking about posts new information every day at midnight, 9 am, 12 pm, 3 pm, 6 pm, and so on, and I have a bunch of users who log into it at pretty much the same time. So I want a copy of the website ready in the cache automatically, before each update brings the users in.
I've heard there is a way to download all content from a site on Linux using wget -m, but I have never tried it.
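For what it's worth, a scheduled mirror along those lines might look like the sketch below. The destination directory, depth limit, and URL are assumptions for illustration, and I haven't tested this against the real site; wget's recursive options (rather than plain -m, which has no depth limit) let you cap the crawl at four or five levels:

```shell
# Mirror the site up to 5 links deep into /var/cache/mirror,
# fetching images/CSS needed to render pages (--page-requisites),
# staying within the site (--no-parent), and only re-downloading
# files that changed since the last run (--timestamping).
wget --recursive --level=5 --page-requisites --no-parent \
     --timestamping --directory-prefix=/var/cache/mirror \
     http://www.exampleweb.com/

# Crontab entry to refresh the mirror 10 minutes before each posting
# time (posts go up at 00:00, 09:00, 12:00, 15:00, 18:00):
# 50 23,8,11,14,17 * * * wget --recursive --level=5 --page-requisites --no-parent --timestamping --directory-prefix=/var/cache/mirror http://www.exampleweb.com/
```

Whether the router can serve this mirror as a transparent cache is a separate question; the commands above only handle fetching the tree ahead of time on a Linux box.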