Baby? Bathwater?!
Trap_jaw, I know you're not one of the developers, but please post the following suggestions to the developer mailing list:
1. The ability to continue searching automatically for files is a BASIC feature of any P2P app. If they're unable to design an intelligent requery function, they're going to lose their audience.
2. For every file left in requery mode, keep a record of every non-firewalled IP that has ever been reported as having the file, and periodically re-request the file directly from those IPs. Since these are direct requests, they waste no Gnutella bandwidth! Also, let the end-user right-click each file and edit its IP list, e.g. if you've been keeping a list manually. And no, I don't think the existence of dynamic IPs invalidates this feature! (A rough sketch of this bookkeeping follows the list.)
3. Keep track of each leaf node's requeries and throttle them. This could become part of the G2 protocol, e.g. "Do not requery more often than every 120 seconds." (Sketched after the list, together with #5.)
4. If you're seeing a lot of unsuccessful requeries, why not try to improve them? If you accept that requeries are a required function, then making them more effective will reduce bandwidth. Requerying for the original search term plus the file size would be more likely to 'hit' than searching for the exact filename, since files get renamed but their sizes don't change. Again, this could become part of the G2 protocol, e.g. "no filenames over 20 characters and no file hashes in requeries." (Also sketched below.)
5. Don't throttle or block manual requeries! (right-click on the search tab and select Repeat Search) The throttle sketch below exempts these.
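
Here's roughly what I mean by #2, sketched in Python. The class names, the 10-minute retry interval, and the try_direct_download hook are all placeholders I made up, not anything from the actual client:

    import time

    class KnownSource:
        """One non-firewalled IP that was seen offering the file."""
        def __init__(self, ip, port):
            self.ip = ip
            self.port = port

    class RequeryEntry:
        """A stalled download plus every source ever reported for it."""
        RETRY_INTERVAL = 600  # seconds between direct retries (assumed value)

        def __init__(self, filename):
            self.filename = filename
            self.sources = []       # list of KnownSource
            self.last_attempt = 0.0

        def add_source(self, ip, port):
            # Record every non-firewalled host that reported the file.
            if not any(s.ip == ip and s.port == port for s in self.sources):
                self.sources.append(KnownSource(ip, port))

        def edit_sources(self, new_list):
            # Lets the user paste in a manually kept IP list (see #2).
            self.sources = [KnownSource(ip, port) for ip, port in new_list]

        def maybe_retry(self, try_direct_download):
            # Periodically ask each known host directly. No Gnutella
            # broadcast traffic is involved, just point-to-point requests.
            now = time.time()
            if now - self.last_attempt < self.RETRY_INTERVAL:
                return
            self.last_attempt = now
            for src in self.sources:
                if try_direct_download(src.ip, src.port, self.filename):
                    break  # got it; stop bothering the other hosts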
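
And for #3 and #5 together, a sketch of how an ultrapeer (or the client itself) could throttle automatic requeries while always letting manual ones through. The 120-second figure is from my suggestion above; everything else is assumed:

    import time

    MIN_REQUERY_INTERVAL = 120  # seconds, per the suggested G2 rule

    class RequeryThrottle:
        """Tracks each leaf node's last requery and drops early repeats."""

        def __init__(self):
            self.last_requery = {}  # leaf id -> timestamp of last requery

        def allow(self, leaf_id, manual=False):
            # Manual requeries (right-click, Repeat Search) always pass (#5).
            if manual:
                return True
            now = time.time()
            if now - self.last_requery.get(leaf_id, 0.0) < MIN_REQUERY_INTERVAL:
                return False  # too soon, drop the automatic requery
            self.last_requery[leaf_id] = now
            return True

For example:

    throttle = RequeryThrottle()
    throttle.allow("leaf-42")               # True: first requery from this leaf
    throttle.allow("leaf-42")               # False: inside the 120s window
    throttle.allow("leaf-42", manual=True)  # True: manual always passes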
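
Finally, #4 in code form: instead of repeating the exact (often renamed) filename, reuse the original search terms plus the file size. The message format here is invented purely for illustration; real G2 query packets look nothing like this:

    def build_requery(original_terms, file_size):
        """Build a requery from the original search terms plus file size.

        Matching on terms + exact size is more likely to 'hit' than an
        exact filename match, since files get renamed but their size
        rarely changes.
        """
        # Enforce the suggested G2-style limits: no long name strings
        # and no file hashes in requeries.
        terms = " ".join(original_terms)[:20]
        return {"query": terms, "size": file_size}

    # Example: the file was originally found by searching "salsa tutorial"
    # and is 4,567,890 bytes; the requery matches on those instead of the
    # current filename.
    requery = build_requery(["salsa", "tutorial"], 4567890)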
(Thank you...)