wget starts downloading then stops with "Cannot write to"
I'm using wget to mirror some files from one server to another, with the following command:
wget -x -N -i http://domain.com/filelist.txt
-x = Because I want to keep the directory structure
-N = Timestamping, so only new or updated files are fetched
-i = To read the list of URLs to download from a file, one per line (example below)
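For reference, filelist.txt is just a plain-text list of URLs; the entries below are placeholders, not the real paths:

http://domain.com/path/to/file.zip
http://domain.com/path/to/other-file.zip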
Small files, such as the 326 KB one I'm testing with, download just fine.
But another file that is 5 GB downloads only 203 MB and then stops (it is always 203 MB, give or take a few kilobytes).
The error message shown is:
Cannot write to âpath/to/file.zipâ
(I'm not sure why there are strange characters before and after the file name. I'm using PuTTY on Windows, which may or may not have something to do with it, so I've left them in. I presume it doesn't.)
The full output is as follows (I have replaced the paths, IP, and domain name):
--2012-08-31 12:41:19--  http://domain.com/filelist.txt
Resolving domain.com... MY_IP
Connecting to domain.com|MY_IP|:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: 161 [text/plain]
Server file no newer than local file âdomain.com/filelist.txtâ

--2012-08-31 12:41:19--  http://domain.com/path/to/file.zip
Connecting to domain.com|MY_IP|:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: 5502192869 (5.1G) [application/zip]
The sizes do not match (local 213004288) -- retrieving.

--2012-08-31 12:41:19--  http://domain.com/path/to/file.zip
Connecting to domain.com|MY_IP|:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: 5502192869 (5.1G) [application/zip]
Saving to: âdomain.com/path/to/file.zipâ

 3% [====>                                  ] 213,003,412  8.74M/s   in 24s

Cannot write to âdomain.com/path/to/file.zipâ
It doesn't seem to make any difference whether the target directory already exists or is created on the fly.
Does anyone have any idea why it is stopping, and how I can fix it?
Any help would be most appreciated.
EDIT: I have also tried a plain wget with no input file, renaming the output file instead. This time it downloads a little over 3 GB and then gives the same "cannot write" error:
wget -x -N http://domain.com/path/to/file.zip -O files/bigfile.zip
Solution 1:
You will get this error if you are out of disk space. Run df and you will see whether the filesystem you're writing to is at 100% capacity. A download that always dies at about the same byte count is consistent with the target partition filling up.
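For example, to check the filesystem that holds the download directory (the path below is a placeholder for your actual target directory):

df -h /path/to/download/dir

If the Use% column shows 100%, free up space or point wget at a larger partition.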
Solution 2:
It can also be a problem with a very long URL. I ran into this too, so I shortened the URL with bit.ly and it worked like a charm.
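For instance, something like the following, where the short link is a made-up placeholder standing in for the shortened URL:

wget -x -N http://bit.ly/AbC123 -O files/bigfile.zip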