Simple cron script to copy remote webpage locally?

Daan Vreeken [PA4DAN] Danovitsch at Vitsch.net
Sun Jul 27 09:24:52 PDT 2003


On Sunday 27 July 2003 16:51, Dragoncrest wrote:
> I've got a webpage that updates dynamically on one of our servers and
> lists a bunch of statistics about spam and such on our servers.  Problem
> is, the script puts a load on the server if too many people access it
> and it eventually kills the server.  I would like to lower the traffic
> on this server by setting up a script on a remote server that is
> activated every 10 minutes by cron and automatically loads the remote
> script then copies the results to a local file on the new public server
> which people can then view at their leisure without killing our stats
> server.  What is going to be the easiest way to do this?  I'm sure there
> has to be a simple way to do this, but I'm kinda drawing a blank on how.
>  Can anyone help?

Take a look at the "fetch" program. Basically you just need to supply it
a URL and a local file name:

# man fetch
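
Something along these lines in the crontab on the public web server should
do it (the URL and file paths below are only placeholders - adjust them for
your own setup):

*/10 * * * * /usr/bin/fetch -q -o /usr/local/www/data/stats.html.new http://stats.example.org/spamstats.cgi && mv /usr/local/www/data/stats.html.new /usr/local/www/data/stats.html

The -q flag suppresses fetch's progress output and -o names the local file
to write. Fetching into a temporary file and then mv'ing it into place means
visitors never see a half-downloaded page while the transfer is running.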

grtz,
Daan

