Quote:
Originally posted by sdsalsero I understand the need to reduce bandwidth waste but, really, I hope there is a way to improve the requery. For instance, maybe maintain a copy of the original query and then requery along with a size or file-hash? That way, the only hits would be the same file. |
Statistically, requeries were always a complete waste of bandwidth: 95% did not return any results at all. Query-by-hash is the most inefficient thing you can do on Gnutella at the moment. It means sending a query to thousands of peers with a very, very low probability that any of them has this particular file. And if more people had the file, you would get them as alternate locations while downloading anyway.
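To make the "very low probability" point concrete, here is a back-of-envelope sketch. The numbers are purely hypothetical (not measured Gnutella data); the point is only that when the per-peer probability of holding one exact file is tiny, even a broadcast to thousands of peers yields almost nothing:

```python
# Illustrative arithmetic only -- peers_reached and p_has_file are
# made-up values, not real network measurements.
peers_reached = 5000      # hypothetical number of peers a query reaches
p_has_file = 0.0001       # hypothetical chance a given peer has that exact file

expected_hits = peers_reached * p_has_file
print(expected_hits)      # ~0.5 -- on average, not even a single result
```

That is why query-by-hash traffic is so wasteful: you pay the full broadcast cost for an expected return of well under one hit.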
But it's not just that requeries are inefficient; they were also abused by other vendors. The network load was unacceptable, so they had to be removed.
Quote:
As for blocking requeries, are they labeled as such? They must be, or else the Ultrapeers couldn't differentiate and block them. |
LimeWire's requeries are labeled as such. But there are a couple of other ways to identify requeries, because they often contain the full filename or the hash of a file.
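As a rough illustration of those "other ways", here is a heuristic sketch of how an Ultrapeer might flag likely requeries from the query text alone, based on the two signals mentioned above: a full urn:sha1 hash, or an exact filename complete with extension. The function name, the filename pattern, and the extension list are all my own illustrative choices, not LimeWire's actual filtering code:

```python
import re

# Base32-encoded SHA-1 URN, as used in Gnutella hash queries:
# 32 characters from A-Z and 2-7.
SHA1_URN = re.compile(r"urn:sha1:[A-Z2-7]{32}", re.IGNORECASE)

# Hypothetical "looks like a complete filename" pattern: word characters,
# spaces, and common punctuation, ending in a known extension.
FULL_FILENAME = re.compile(r"[\w .()\[\]-]+\.(mp3|avi|ogg|zip)", re.IGNORECASE)

def looks_like_requery(query_text: str) -> bool:
    """Heuristic guess: is this query probably an automated requery?"""
    if SHA1_URN.search(query_text):
        return True   # query-by-hash: almost certainly a download retry
    if FULL_FILENAME.fullmatch(query_text.strip()):
        return True   # an exact filename, not the loose keywords a human types
    return False
```

For example, `looks_like_requery("urn:sha1:PLSTHIPQGSSZTS5FJUPAKUZWUGYQYPFB")` returns True, while a keyword search like `looks_like_requery("beatles yesterday")` returns False. A real filter would of course also use the explicit requery labeling in the first place.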
Quote:
I've been a paying supporter of LW since day 1 (I'm on my 3rd subscription now), but this loss of functionality is really, really pissing me off... |
There is no other way if some people won't play by the rules. There are clients out there that send requeries every five seconds, and if you want to ban their requeries, you have to ban all requeries.