MJ Ray <markj@cloaked.freeserve.co.uk> writes:
> Richard Kettlewell <rjk@terraraq.org.uk> wrote:
>> the biggest danger spots for pipelining, as sometimes authentication
>> involves handing off the connection to another process for a bit.
> Surely that delay must not drop any data from the socket, else the
> server is in error.  I've done some POP3 pipelining and the only
> problem has been some non-compliant servers that try to rely on an
> alternating dialogue.
It's nothing to do with delays. The problem arises when something on the server side uses stdio or some other buffered I/O library. Sendmail and INN are real examples of this.
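To make the failure mode concrete, here is a minimal sketch (a pipe stands in for the socket, and the commands are invented): the buffered reader slurps everything the client pipelined, so when the raw descriptor is handed off, the second command has vanished into the first process's private buffer.

```python
import os

# Simulate a client pipelining two commands down one connection.
r, w = os.pipe()
os.write(w, b"MAIL FROM:<a@example.com>\r\nRCPT TO:<b@example.com>\r\n")
os.close(w)

# The server wraps the descriptor in a buffered reader, as stdio would.
buffered = os.fdopen(r, "rb")
first = buffered.readline()  # one command returned; the rest has been
                             # read ahead into the library's buffer

# Now hand the raw descriptor to "another process": it sees nothing,
# because the second command is stranded in the buffer above.
leftover = os.read(buffered.fileno(), 1024)
```

`first` is the first command, `leftover` is empty, and the second command is only recoverable from the buffered object itself; that object does not survive a handoff to another process.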
[...]
>> If you just want to download a single file I'm not sure I see why
>> one would look at robots.txt.
> If you're doing it automatically, you're supposed to, IIRC.
I must have missed the bit where we said we were only talking about automated requests. (I bet the majority of downloads are manual...)
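For what it's worth, checking is only a few lines for an automated client.  A sketch using Python's stdlib robots.txt parser; the robots.txt body and the bot name here are made up, and a real client would fetch the file from the host first:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt body, parsed directly for illustration.
rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /private/",
])

# Ask before fetching a URL under the disallowed prefix.
ok = rp.can_fetch("examplebot", "http://example.com/private/report.pdf")
```

Here `ok` comes back false, while a URL outside `/private/` would be allowed.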
>> The headers probably fit in a single packet and don't require any
>> back-and-forth between the client and the server, so aren't going to
>> contribute much to the time taken.
> I'm not so sure.  Headers seem to be getting more and more verbose.
> Are they just adding things back in to emulate FTP's richness?
The headers on my home page come to under 500 bytes (including three lines of rubbish from proxies).
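As a rough sanity check, an invented but plausible response header block comes to well under 500 bytes, so it fits comfortably in one TCP segment:

```python
# Invented header values, purely to gauge the size of a typical block.
headers = (
    "HTTP/1.0 200 OK\r\n"
    "Date: Sun, 01 Jul 2001 12:00:00 GMT\r\n"
    "Server: Apache/1.3.19 (Unix)\r\n"
    "Last-Modified: Mon, 25 Jun 2001 09:30:00 GMT\r\n"
    "Content-Type: text/html\r\n"
    "Content-Length: 4096\r\n"
    "\r\n"
)
size = len(headers.encode("ascii"))
```

Even with a few extra lines of proxy rubbish, the total stays an order of magnitude below a typical 1460-byte segment payload.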