I decided to fix this problem by writing Cache Remote File, a PHP script that performs three functions:
The script gives up trying to load the remote file after three seconds, which keeps it from hanging when the remote server is having difficulties. It can be customized to load any URL of any content type, and it requires PHP 4 or higher with cURL support. I've released it under the GPL. Let me know if you have any problems with it or have suggestions for improving it.
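The actual Cache Remote File source isn't reproduced here, but the timeout behavior described above can be sketched with PHP's cURL functions (the function name is mine, not the script's):

```php
<?php
// Minimal sketch of the three-second timeout described above.
// cURL gives up after three seconds, so a slow remote server
// can't hang the page that includes this content.
function fetch_with_timeout($url)
{
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);  // return the body instead of printing it
    curl_setopt($ch, CURLOPT_TIMEOUT, 3);            // total time limit, in seconds
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 3);     // limit on the connect phase alone
    $page = curl_exec($ch);
    curl_close($ch);
    return $page;  // false on timeout or any other cURL error
}
```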
Reporting on Cache Remote File, a PHP script is well within your rights.
It is also sleazeball shit.
It took me a couple of hours to get the meaning of your comment. I'm slipping.
Nice algorithm, but I would have made it a function to which you could pass the URL and local filename, so you could easily cache more than one such page. I also wouldn't echo the result, I'd just return it and let the caller echo it if desired.
echo get_cached_remote_page($url, $localfile);
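A sketch of what that function might look like, keeping the original script's three-second timeout. The function name matches the call above; the $maxage parameter, its 600-second default, and the stale-copy fallback are my assumptions, and file_put_contents() bumps the requirement to PHP 5:

```php
<?php
// Commenter's suggestion as a reusable function: pass the URL and local
// cache filename, and return the page rather than echoing it.
function get_cached_remote_page($url, $localfile, $maxage = 600)
{
    // Serve the local copy if it exists and is still fresh.
    // The 600-second lifetime is an assumed default, not from the script.
    if (is_file($localfile) && (time() - filemtime($localfile)) < $maxage) {
        return file_get_contents($localfile);
    }

    // Otherwise fetch the remote file, giving up after three seconds
    // as the original script does.
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_TIMEOUT, 3);
    $page = curl_exec($ch);
    curl_close($ch);

    if ($page !== false) {
        file_put_contents($localfile, $page);  // refresh the cache
        return $page;
    }

    // Fetch failed: fall back to a stale cached copy if one exists.
    return is_file($localfile) ? file_get_contents($localfile) : false;
}
```

The caller then decides whether to echo, exactly as in the one-liner above, and the same function can cache any number of different pages.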