Downloading in chunks will not hurt performance significantly as long as the chunk size is reasonable and both ends support Keep-Alive and HTTP pipelining.
Even if the other end does not support Keep-Alive or HTTP pipelining, download performance shouldn't suffer much as long as the downloader uses reasonable chunk sizes. If a client doesn't support at least Keep-Alive, it definitely should not download files in chunks. This matters most when downloading from firewalled hosts: establishing a connection through PUSH has a relatively low success rate, so it is important not to disconnect from the host once you have one.
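Here's a rough sketch of what I mean by chunked downloading over one persistent connection. The host, path, and 256 KiB chunk size are made-up example values, and Python's http.client is used only because a single HTTP/1.1 connection gets reused across sequential Range requests (plain Keep-Alive, no pipelining):

[code]
import http.client

def download_in_chunks(host, path, total_size, chunk=256 * 1024):
    # One HTTP/1.1 connection is reused for every chunk (Keep-Alive),
    # so there is no reconnect cost between Range requests.
    conn = http.client.HTTPConnection(host)
    data = bytearray()
    offset = 0
    while offset < total_size:
        end = min(offset + chunk, total_size) - 1
        conn.request("GET", path, headers={"Range": f"bytes={offset}-{end}"})
        resp = conn.getresponse()
        if resp.status != 206:  # anything but Partial Content means Range wasn't honored
            raise IOError(f"unexpected status {resp.status}")
        data += resp.read()  # read the body fully so the connection can be reused
        offset = end + 1
    conn.close()
    return bytes(data)
[/code]

The chunk size is the tuning knob here: bigger chunks mean fewer requests and less header overhead, smaller chunks mean finer-grained swarming between sources.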
I wouldn't call a fixed 97.7k chunk size reasonable. It results in far too many re-requests when downloading large files. Also, if the client doesn't support Keep-Alive (you said it loses the connection after downloading a chunk), chunking will definitely result in very poor download performance, even if there are a lot of reliable sources available.
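To put the re-request overhead in numbers (the 700 MiB file size is just an assumed example):

[code]
# Rough request-count comparison; the 700 MiB file size is an assumption
# picked only to show the overhead of a fixed 97.7 KiB chunk.
FILE_SIZE = 700 * 1024 * 1024

for chunk_kib in (97.7, 512, 4096):
    requests = FILE_SIZE / (chunk_kib * 1024)
    print(f"{chunk_kib:>7} KiB chunks -> ~{requests:,.0f} requests")
[/code]

Without Keep-Alive, each of those roughly 7,300 requests at 97.7 KiB also pays for a fresh TCP handshake, and for a firewalled host another PUSH round trip on top of that, which is where the performance collapses.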
Keep-Alive is an essential feature when downloading files in chunks, and HTTP pipelining is highly recommended.
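For what it's worth, here is a sketch of what pipelining the Range requests could look like over a raw socket. It assumes the server actually supports HTTP/1.1 pipelining, answers each request with a Content-Length header, and doesn't use chunked transfer encoding, none of which is guaranteed in practice:

[code]
import socket

# Sketch of pipelined Range requests (assumptions: the server supports
# HTTP/1.1 pipelining, returns Content-Length, no chunked encoding).
def pipelined_ranges(host, path, ranges):
    sock = socket.create_connection((host, 80))
    # Queue every request on the wire before reading any response.
    for start, end in ranges:
        req = (f"GET {path} HTTP/1.1\r\n"
               f"Host: {host}\r\n"
               f"Range: bytes={start}-{end}\r\n\r\n")
        sock.sendall(req.encode("ascii"))
    # HTTP/1.1 requires responses to come back in request order.
    reader = sock.makefile("rb")
    bodies = []
    for _ in ranges:
        length = None
        while True:  # skip the status line and headers, noting Content-Length
            line = reader.readline().rstrip(b"\r\n")
            if not line:
                break
            if line.lower().startswith(b"content-length:"):
                length = int(line.split(b":", 1)[1])
        assert length is not None, "sketch assumes Content-Length is present"
        bodies.append(reader.read(length))
    sock.close()
    return bodies
[/code]

The win over plain Keep-Alive is that the requests ride in the same packets as earlier reads, so you don't idle for a round trip between every chunk.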