I wish I knew. All of these ideas sound reasonable, if only they worked in practice. The major developers, like LimeWire and BearShare, who have the tools to measure the impact on the network, are trying either workarounds or new ideas. You can follow the discussions about how the Gnutella protocol is constantly being revised by reading the daily (or archived) posts at
http://groups.yahoo.com/group/the_gdf/
btw--I like the idea of watching for when the machine has had no mouse activity for a few hours and then doing a "repeat search" for any pending downloads.
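Just to make the idea concrete, here's a rough sketch of that idle-then-requery loop. All the names here are made up for illustration (this isn't anybody's actual code), and real idle detection is platform-specific, so I've left it as a hook:

```python
import time

IDLE_THRESHOLD = 2 * 60 * 60  # "a few hours" of no input, in seconds

def idle_requery_loop(seconds_since_input, repeat_search, pending_downloads, poll=600):
    """Toy sketch: poll for user inactivity, and once the machine has been
    idle long enough, fire a repeat search for every download still pending.

    seconds_since_input and repeat_search are hypothetical hooks -- getting
    the real idle time is OS-specific (e.g. GetLastInputInfo on Windows).
    """
    while pending_downloads():
        if seconds_since_input() >= IDLE_THRESHOLD:
            for dl in pending_downloads():
                repeat_search(dl)  # refresh the source list for this download
        time.sleep(poll)
```

The point is just that the client, not the user, notices the machine is unattended and does the babysitting.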
Currently, though, if enough sources have been found for a file, LW will keep trying them until all have been tried. This usually means that a file left unattended overnight will complete without babysitting. The key is to repeat the search a few times to build up the alternate locations, then leave it alone. In practice, this works as well as or better than requeries. Every time you download a chunk of a file, the host is also supposed to send a list of the alternate locations it knows.
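For what it's worth, that try-every-source behavior plus the alternate-location mesh can be sketched roughly like this (all names are made up for illustration; this is not LimeWire's actual code, just the gist of it):

```python
def download_file(initial_sources, fetch_chunk, chunks_needed):
    """Toy sketch of mesh-based downloading: try every known source, and
    fold in any alternate locations each host reports alongside a chunk,
    so the source pool grows as the download progresses."""
    sources = set(initial_sources)  # hosts believed to have the file
    tried = set()
    completed = 0
    while completed < chunks_needed:
        untried = sources - tried
        if not untried:
            break  # all known sources exhausted; a repeat search would refresh them
        host = untried.pop()
        tried.add(host)
        # hypothetical hook: returns (success flag, alt locations the host knows)
        ok, alt_locations = fetch_chunk(host)
        sources |= set(alt_locations)  # the mesh: hosts share who else has the file
        if ok:
            completed += 1
    return completed == chunks_needed
```

This is why building up alternates early matters: once the pool is big enough, the client can grind through it overnight on its own, and it only stalls when every known source has been tried.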
cheers