Hi folks,
I am having trouble downloading a 700MB ISO image. I tried last night, and the download finished after retrieving only 490MB, reporting "Download Complete". I tried again this morning, and it finished after retrieving only 331MB, again reporting "Download Complete". Firstly, I am confused as to why both downloads finished prematurely without giving any error message; secondly, is there a better way to download this large file than simply clicking on it on the web page and letting Mozilla handle the FTP download automatically, as I do at present?
Are there any tips and techniques more experienced ALUGers would like to share to help me download this large file successfully?
Thanks,
Ian.
PS: Background info in case it is relevant: my PC is running Debian 3.0 with over 5GB of free disk space, and I am downloading over a dedicated 128kB NTL broadband link.
Ian Douglas wrote:
Are there any tips and techniques more experienced ALUGers would like to share to help me download this large file successfully?
wget ftp://foo.bar.example/mylarge.iso, and if the connection drops you can resume by using wget -c.
You could also try ncftp, which has support for resuming downloads. Mozilla probably isn't the greatest application for downloading very large files, due to its tendency to crash at just the wrong moment...
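To make that concrete, a session might go something like this (host and path are invented for the example; substitute the real mirror):

  wget ftp://ftp.example.org/pub/images/mylarge.iso
  # ...connection drops part-way through...
  wget -c ftp://ftp.example.org/pub/images/mylarge.iso

The -c (--continue) switch makes wget carry on from the end of the partial file in the current directory instead of starting again from byte zero.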
Adam
On Fri, Nov 22, 2002 at 06:21:39PM +0000, Adam Bower typed the following...
Are there any tips and techniques more experienced ALUGers would like to share to help me download this large file successfully?
wget ftp://foo.bar.example/mylarge.iso, and if the connection drops you can resume by using wget -c.
You could also try ncftp, which has support for resuming downloads. Mozilla probably isn't the greatest application for downloading very large files, due to its tendency to crash at just the wrong moment...
...and that's all Adam Bower wrote I'm afraid
Yup, I use both of those and can recommend them. I also occasionally use Downloader for X:
http://www.krasu.ru/soft/chuchelo/
Which is a pretty handy GUI download manager.
On Friday, November 22, 2002 6:21 PM, Adam Bower wrote:
Ian Douglas wrote:
Are there any tips and techniques more experienced ALUGers would like to share to help me download this large file successfully?
wget ftp://foo.bar.example/mylarge.iso, and if the connection drops you can resume by using wget -c.
Thanks Adam, and indeed to everyone else who pointed out this command. I guess simple commands like this are elementary to most of you, but I was very pleased to discover it and to notice that it was already installed on my system. Thankfully, its command line is simple enough even for an idiot like me to understand, so I gave it a go and it worked perfectly, resuming a previously crashed download without problems.
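In case it helps anyone else, the whole exercise boiled down to re-running the same command with -c after the first attempt died, then a quick size check (the mirror URL here is just a placeholder for the real one):

  wget ftp://ftp.example.org/pub/images/mylarge.iso
  wget -c ftp://ftp.example.org/pub/images/mylarge.iso
  ls -l mylarge.iso    # confirm it really is the full 700MB this time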
Thanks again to everyone for your help,
Ian.
One word: ProZilla (http://prozilla.genesys.ro/). I've been using this for absolutely ages, and not only does it improve the time it takes to download large files, it also supports resumable downloads in case things bugger up. It supports FTP and HTTP transfers.
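Usage is about as simple as wget's; from memory it is roughly the following (the URL is a placeholder, and do check the man page for the exact connection-count flag):

  proz -k=4 ftp://ftp.example.org/pub/images/mylarge.iso

The -k=4 asks for four simultaneous connections, which is where the speed improvement comes from.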
Regards,
Martyn
--
Martyn Drake               | Email : martyn-d@moving-picture.com
Systems Administrator      | Web   : http://www.moving-picture.com
The Moving Picture Company | Phone : +44 (0)20 7494 7853
-----Original Message-----
From: main-admin@lists.alug.org.uk [mailto:main-admin@lists.alug.org.uk] On Behalf Of Ian Douglas
Sent: 22 November 2002 18:15
To: ALUG List
Subject: [Alug] What is the best way to perform large downloads?
[snip]
Are there any tips and techniques more experienced ALUGers would like to share to help me download this large file successfully?
[snip]
The command-line FTP client has a 'reget' command (or is it 'restart' or 'resume'? Try RTFM; my memory gives out ;-( ) which will restart any fetch. However, as has been mentioned, 'wget -c' will also do it.
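From memory, a resumed fetch with the stock client goes roughly like this (host and path are invented for the example):

  $ ftp ftp.example.org
  Name: anonymous
  Password: you@example.org
  ftp> cd /pub/images
  ftp> binary
  ftp> reget mylarge.iso
  ftp> quit

'binary' matters for an ISO image; 'reget' then continues from wherever the local copy of mylarge.iso stops.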
On 22-Nov-2002 Ian Douglas wrote:
[snip]
Are there any tips and techniques more experienced ALUGers would like to share to help me download this large file successfully?
[snip]
On Sunday, November 24, 2002 7:57 PM, Raphael Mankin wrote:
The command-line FTP client has a 'reget' option which will restart any fetch. However, as has been mentioned, 'wget -c' will also do it.
Thanks for this additional tip about the FTP command, Raphael.
Ian.