MJ Ray <markj@cloaked.freeserve.co.uk> writes:
> Richard Kettlewell <rjk@terraraq.org.uk> wrote:
>> MJ Ray <markj@cloaked.freeserve.co.uk> writes:
>>>> If you just want to download a single file I'm not sure I see why
>>>> one would look at robots.txt.
>>> If you're doing it automatically, you're supposed to, IIRC.
>> I must have missed the bit where we said we were only talking about
>> automated requests.  (I bet the majority of downloads are manual...)
> I must have missed the bit where I didn't say "possible" for the
> download of robots.txt.  Whether it's one file or many is irrelevant.
> Whether it's manual or automatic is relevant.  That's what I said.
Why so defensive? If we're discussing the relative performance of FTP and HTTP then concentrating on unusual usage cases tells you nothing of any use; rather, one must look at the mainstream.
Given that the OP didn't specify what they were trying to do in any particular detail, that's interactive, non-automatic download, largely using the "Save link as" option in a web browser, and probably interactively using some command line tool in a small set of cases.
The number of people doing automatic downloads, for instance mirroring, will be tiny in comparison.
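(For reference, the convention under discussion is the robots exclusion protocol: an automated client is expected to fetch robots.txt and obey it before downloading. A minimal sketch using Python's standard urllib.robotparser; the user-agent string and URLs are made up for illustration, and the rules are fed in directly rather than fetched over the network:)

```python
from urllib import robotparser

rp = robotparser.RobotFileParser()
# A real automated client would do:
#   rp.set_url("http://www.example.com/robots.txt"); rp.read()
# which is the extra request being debated above.  Here we feed the
# rules in directly so the sketch is self-contained.
rp.parse([
    "User-agent: *",
    "Disallow: /private/",
])

# Only proceed with the automated download if the rules allow it.
print(rp.can_fetch("MyMirrorBot/1.0", "http://www.example.com/pub/file.tar.gz"))
print(rp.can_fetch("MyMirrorBot/1.0", "http://www.example.com/private/file"))
```

A purely interactive "Save link as" download, of course, involves none of this.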