The command-line FTP client has a 'reget' command (see the ftp man page) which will restart an interrupted fetch. However, as has been mentioned, 'wget -c' will also do it.
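For example, to resume a partial download you could do something like this (the host and filename here are just illustrative, substitute the real mirror and ISO name):

    wget -c ftp://ftp.example.org/pub/images/debian-3.0.iso

or, from inside the command-line ftp client, after cd'ing to the right directory:

    ftp> reget debian-3.0.iso

Both pick up from the end of the partial file already on disk, provided the server supports resuming (the FTP REST command), which most do these days.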
On 22-Nov-2002 Ian Douglas wrote:
Hi folks,
I am having hassle downloading a 700MB ISO image. I tried last night, and the download finished after retrieving only 490MB, saying "Download Complete". I tried again this morning, and it finished after retrieving only 331MB, again reporting "Download Complete". Firstly, I am confused as to why both downloads finished prematurely without giving any error messages. Secondly, is there a better way to download this large file than simply clicking on it on the web page and letting Mozilla handle the FTP download automatically, as I do at present?
Are there any tips and techniques more experienced ALUGers would like to share to help me download this large file successfully?
Thanks,
Ian.
PS: Background info in case it is relevant: my PC is running Debian 3.0 with over 5GB of free disk space, and I am downloading over a dedicated 128kbit/s NTL broadband link.