"Keith Watson" Keith.Watson@Kewill.com writes:
http://slashdot.org/article.pl?sid=03/02/13/2132221
An anonymous coward asks "Looking to serve files for downloading (typically 1MB-6MB), I'm confused about whether I should provide an FTP server instead of / as well as HTTP. According to a rapid Google search, the experts say 1) HTTP is slower and less reliable than FTP and 2) HTTP is amateur and will make you look a wimp. But a) FTP is full of security holes, and b) FTP is a crumbling legacy protocol and will make you look a dinosaur. Surely some contradiction... Should I make the effort to implement FTP or take desperate steps to avoid it?"
So what's the general opinion about FTP vs. HTTP in the GNU/Linux environment?
FTP is certainly not inherently faster or less reliable than HTTP; for large files I would expect it to turn out about the same, as most of the time both are just shifting raw data straight from a file on disk to a TCP connection.
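On Unix systems that "straight from a file on disk to a TCP connection" path is quite literal: either kind of server can hand the copy to the kernel. A toy sketch with Python's os.sendfile, where a socketpair stands in for a real client connection (everything here is illustrative, not either protocol's actual server code):

```python
import os
import socket
import tempfile

# Toy sketch: the bulk of either an FTP or HTTP download is just file bytes
# copied into a socket. os.sendfile asks the kernel to do that copy directly
# (Unix-only; the socketpair below stands in for a real client connection).
payload = b"x" * 4096
with tempfile.NamedTemporaryFile() as f:
    f.write(payload)
    f.flush()
    out_sock, client_sock = socket.socketpair()
    offset = 0
    while offset < len(payload):
        # sendfile may send fewer bytes than asked, so loop until done
        offset += os.sendfile(out_sock.fileno(), f.fileno(),
                              offset, len(payload) - offset)
    out_sock.close()  # signal end of the "download"
    chunks = []
    while True:
        chunk = client_sock.recv(4096)
        if not chunk:
            break
        chunks.append(chunk)
    received = b"".join(chunks)
    client_sock.close()
```

The point being that once the transfer is under way, the protocol framing around it contributes almost nothing for a multi-megabyte file.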
For small files FTP has the overhead of creating a new TCP data connection for every individual file transferred. HTTP/1.0 had the same problem, but HTTP/1.1 persistent connections let a client fetch any number of files over a single TCP connection.
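A minimal sketch of the persistent-connection point, using Python's stdlib http.client against a throwaway local server (the file names and payloads are made up for illustration):

```python
import http.client
import http.server
import os
import tempfile
import threading

# Serve two small files from a temporary directory.
docroot = tempfile.mkdtemp()
for name in ("a.bin", "b.bin"):
    with open(os.path.join(docroot, name), "w") as f:
        f.write("payload of " + name)

class Handler(http.server.SimpleHTTPRequestHandler):
    protocol_version = "HTTP/1.1"  # enables keep-alive (1.0 closes per request)
    def __init__(self, *args, **kwargs):
        super().__init__(*args, directory=docroot, **kwargs)

server = http.server.ThreadingHTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]

# One TCP connection, two GETs: no per-file connection setup.
conn = http.client.HTTPConnection("127.0.0.1", port)
bodies = []
for name in ("a.bin", "b.bin"):
    conn.request("GET", "/" + name)
    resp = conn.getresponse()
    bodies.append(resp.read().decode())
conn.close()
server.shutdown()
```

FTP, by contrast, opens a fresh data connection per RETR no matter what.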
Furthermore, FTP has the overhead of setting up the control connection and logging in, which could quite plausibly double the number of round trips required to fetch a single small file.
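To make that concrete, here is a rough tally of the exchanges needed to fetch one small file over each protocol. The step lists are illustrative assumptions, not measurements; real counts vary with server greetings, authentication, and active vs. passive mode:

```python
# Rough exchange tally for fetching ONE small file over each protocol.
# Illustrative assumptions only: actual counts depend on the server's
# banner, login policy, and active vs. passive data transfer.
http_steps = [
    "TCP handshake",
    "GET request -> response with file body",
]
ftp_steps = [
    "TCP handshake (control connection)",
    "server greeting (220)",
    "USER -> 331",
    "PASS -> 230",
    "TYPE I -> 200",
    "PASV -> 227",
    "TCP handshake (data connection)",
    "RETR -> 150, file body, 226",
]
print(f"HTTP: {len(http_steps)} exchanges, FTP: {len(ftp_steps)} exchanges")
```

For a 1MB-6MB file this setup cost disappears into the noise; for many small files it dominates.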
"HTTP is amateur" doesn't seem to mean anything at all. I might as well say "FTP is evil and must die" (which is indeed my opinion but is probably not a very convincing argument for anyone who doesn't share it).
FTP is not inherently full of security holes, but many implementations have had security problems in the past. Then again, exactly the same is true of HTTP. I don't think there's any good reason for choosing one over the other here.