I created a shell script to invoke the above-mentioned Perl script.
This shell script ensures that we always have a valid cloud map.
if [ -s clouds.jpg ]; then
    RESULT=`file clouds.jpg | grep -c "JPEG image data"`
    if [ $RESULT -eq 1 ]; then
        mv clouds.jpg clouds_2048.jpg
    fi
fi
It invokes the Perl script, then makes sure that the resulting file is non-zero in size.
The script then verifies that the file is a JPEG image, as opposed to an HTML file
such as might be returned by a 404 error. If all is OK, it moves the fetched file
over the stored one and finishes.
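Assembled into one piece, the wrapper might look like the sketch below. The fetcher's name (grab_clouds.pl) and the working directory are assumptions, since the Perl script itself isn't shown here:

```shell
#!/bin/sh
# Sketch of clouds.sh; the fetcher name and paths are assumptions.
WORKDIR=/home/dan/images       # assumed location of the stored cloud map

# Succeeds only if the file is non-empty AND file(1) identifies it as a
# JPEG, rather than, say, an HTML page returned with a 404 error.
valid_jpeg() {
    [ -s "$1" ] && [ "`file "$1" | grep -c 'JPEG image data'`" -eq 1 ]
}

if cd "$WORKDIR" 2>/dev/null; then
    perl /home/dan/bin/grab_clouds.pl      # fetches clouds.jpg (assumed)
    if valid_jpeg clouds.jpg; then
        mv clouds.jpg clouds_2048.jpg      # replace the stored map
    fi
fi
```

Keeping the download under its temporary name (clouds.jpg) until it passes both checks means a failed fetch never clobbers the last good map.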
The cronjob I use to invoke the shell script is:
15 */3 * * * dan /home/dan/bin/clouds.sh
This runs at 15 minutes past the hour, every three hours, around the clock. However, why run it that often? For example,
if you're doing this on your office system, why not restrict updates to something that resembles your
working hours? Something more like this:
15 7-18/3 * * 1-5 /home/dlangille/bin/clouds.sh > /dev/null
- 15 - run at 15 minutes after the hour
- 7-18/3 will run it only between 7AM and 6PM, and then only every three hours, i.e. 7AM, 10AM, 1PM, and 4PM
- 1-5 will run the job only on Monday-Friday
Why restrict it? To reduce the bandwidth consumed. This helps both you (or your office) and the
provider of the images. Every bit helps.
Once it has run successfully, you can get rid of the output that cron mails to you. The crontab line above already sends stdout to /dev/null; append 2>&1 so that stderr follows it:
15 7-18/3 * * 1-5 /home/dlangille/bin/clouds.sh > /dev/null 2>&1
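The order of the redirections matters: stdout must be sent to /dev/null first, and only then is stderr pointed at the (already redirected) stdout. A quick sketch of the difference:

```shell
# Correct order: stdout goes to /dev/null, then stderr is duplicated onto
# the already-redirected stdout, so both streams are discarded.
BOTH=`sh -c 'echo out; echo err >&2' > /dev/null 2>&1`

# Reversed order: stderr is duplicated onto the original stdout before
# stdout moves to /dev/null, so the error message still gets through.
LEAK=`sh -c 'echo out; echo err >&2' 2>&1 > /dev/null`
```

With the reversed order, cron would still mail you any error output, which defeats the purpose.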
That should be everything you need to have a much more interesting background. Enjoy.