Here in Pakistan, home users' download/upload speeds are far lower than what we get on our live hosting servers.
I recently had to deploy an open-source PHP application on a live server. The traditional process is to download the archive from the project's website to your PC and then upload it to your server, which is very time-consuming (download time plus upload time) if the archive is 100 MB+.
So I wrote a little PHP script which, when executed on the server, fetches the file directly from the source server and stores it locally on the destination server. It is the equivalent of the wget command in Linux, which you can use directly if your hosting allows SSH access. If it does not, as in my case, you can use this wget.php wrapper instead. It can also be used to back up files from one server to another.
Here is the sample code for the file "wget.php":
<?php
// wget.php: fetch a remote file and store it on this server
$url = $_GET["url"];
// Local file name = last segment of the URL
$f = explode("/", $url);
$f = $f[count($f) - 1];
$ch = curl_init($url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return the body rather than printing it
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true); // follow redirects
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 5);    // connection timeout in seconds
$data = curl_exec($ch);
curl_close($ch);
// Alternatively: $data = file_get_contents($url); (requires allow_url_fopen)
file_put_contents($f, $data); // save the file next to this script
echo $f . " downloaded";
?>
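Once wget.php is uploaded to your hosting account, you trigger it from the browser by passing the source file's URL in the query string. The host and archive names below are placeholders; substitute your own:

http://www.yoursite.com/wget.php?url=http://example.com/archive.zip

The script then downloads archive.zip onto the server, next to wget.php, without the file ever passing through your home connection.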