MJ Ray <markj@cloaked.freeserve.co.uk> writes:
Richard Kettlewell <rjk@terraraq.org.uk> wrote:
Furthermore, FTP has the additional overhead of setting up the control connection and logging in, which could quite plausibly double the number of round trips required to fetch a single small file.
I'm not sure this part holds: nothing requires you to wait for an answer to the login before sending the commands, although it is "intended to be an alternating dialogue". If the login fails, you've just given the server some garbage, but hey, you're not a human who mistyped, you're a dumb downloader program that will now fail anyway. Therefore you can pipeline commands.
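A minimal sketch of that idea (the helper names and the anonymous credentials are mine, not from the thread): batch the whole login dialogue into one buffer and send it with a single write, reading the queued replies only afterwards.

```python
import socket

def pipelined_login(user: str, password: str, commands: list[str]) -> bytes:
    """Batch the whole FTP dialogue into one buffer so it can be sent
    with a single write, without waiting for intermediate replies."""
    lines = [f"USER {user}", f"PASS {password}", *commands]
    return "".join(cmd + "\r\n" for cmd in lines).encode("ascii")

def blind_fetch(host: str, path: str) -> None:
    """Send the pipelined batch blind; if the login fails, the server
    just rejects the later commands and we fail anyway."""
    with socket.create_connection((host, 21)) as ctrl:
        batch = pipelined_login("anonymous", "guest@", ["TYPE I", f"RETR {path}"])
        ctrl.sendall(batch)
        # ...then read and match the replies in order...
```

Whether a given server tolerates this is another matter, of course.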
You can probably pipeline to an extent, yes - but see the difficulties with pipelining in SMTP and NNTP. Authentication strikes me as one of the biggest danger spots for pipelining, as sometimes authentication involves handing off the connection to another process for a bit.
In passive mode (quite popular for clients behind firewalls) you have to wait for the response to the PASV command before you can create the data connection.
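To illustrate why that wait is unavoidable: the data-connection endpoint only exists inside the 227 reply, so the client is stalled until it arrives. A sketch of the parse (the helper name is mine):

```python
import re

def parse_pasv(reply: str) -> tuple[str, int]:
    """Extract the data-connection endpoint from a 227 PASV reply.
    The six numbers are four address octets plus the port split
    into high and low bytes (port = p1 * 256 + p2)."""
    m = re.search(r"(\d+),(\d+),(\d+),(\d+),(\d+),(\d+)", reply)
    if not m:
        raise ValueError("malformed PASV reply: " + reply)
    h1, h2, h3, h4, p1, p2 = map(int, m.groups())
    return f"{h1}.{h2}.{h3}.{h4}", p1 * 256 + p2
```

Only once this returns can the client open the data connection, hence the forced round trip.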
and compared with HTTP headers and possible download of robots.txt in each direction, it's positively minimalist.
If you just want to download a single file I'm not sure I see why one would look at robots.txt. If you want many files then I'd expect the cost of the control connection to rapidly beat a robots.txt lookup.
The headers probably fit in a single packet and don't require any back-and-forth between the client and the server, so aren't going to contribute much to the time taken.
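For a sense of scale (hypothetical host and path), a minimal HTTP/1.1 request is a few dozen bytes and fits comfortably inside one TCP segment:

```python
def minimal_get(host: str, path: str) -> bytes:
    """Build the smallest useful HTTP/1.1 request: one segment's worth
    of headers, no intermediate round trips before the response."""
    req = (f"GET {path} HTTP/1.1\r\n"
           f"Host: {host}\r\n"
           "Connection: close\r\n"
           "\r\n")
    return req.encode("ascii")

request = minimal_get("www.example.com", "/files/small.txt")
# len(request) is well under a typical ~1460-byte TCP segment
```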
Perhaps we should stop speculating and start profiling l-)