Quote:
Originally posted by rockkeys Finding 20 or 30 files on a search, of which only 2 or 3 have multiple sources, is ridiculous, since many of the common sources on p2p networks are misconfigured or have routing problems that result in throughputs of a couple of hundred characters per second. |
A low throughput can mean that a remote servent is misconfigured. But it can also mean that this is as fast as the connection allows, because parallel uploads are going on.
Quote:
The large maximum is needed because some file groupings really do have hundreds of files associated with them. For example, the Anime video series Inu Yasha has 125 episodes, and there are multiple versions, both dubbed and undubbed, out there. The only search term you can count on is the title, because of the way users list their shared files. |
AFAIK, 128 Inuyasha episodes are available via different p2p networks - but you won't be able to find & download many of them with Gnutella.
Quote:
Limiting a search return to 20 or even 50 files would prevent the user from ever finding the complete set of files. |
That's why the limit is at ~150 results.
Quote:
[...]
LimeWire, specifically the beta currently available, has no ability to retry a search automatically, and only seems to allow one additional re-search before clearing the search and starting over.
Yet after leaving the program running for a day, and then searching, I was unable to find more than 27 files in my category. |
LimeWire's lack of automatic re-searching is intentional. Finding additional sources via keyword search is very inefficient; that's what the download mesh is for. Once your download has started, you will be informed about other sources for this particular file automatically.
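To illustrate the idea of the download mesh: once you have one source for a file, every host you contact can report further known locations for the same file, so the set of sources grows without any repeated keyword queries. This is only a minimal sketch - the class and method names are illustrative, not LimeWire's actual API.

```python
# Minimal sketch of a download mesh: sources found for one file are
# pooled, and each source we contact can report further "alternate
# locations" for the same file. Names are illustrative assumptions.

class DownloadMesh:
    def __init__(self, file_hash):
        self.file_hash = file_hash      # e.g. a SHA-1 URN identifying the file
        self.sources = set()            # known (host, port) pairs

    def add_source(self, host, port):
        """Record a source found via the initial keyword search."""
        self.sources.add((host, port))

    def merge_alt_locations(self, alt_locs):
        """Fold in alternate locations reported by a contacted source."""
        for host, port in alt_locs:
            self.sources.add((host, port))

mesh = DownloadMesh("urn:sha1:EXAMPLEHASH")
mesh.add_source("10.0.0.1", 6346)                # from the keyword search
mesh.merge_alt_locations([("10.0.0.2", 6346),    # reported by 10.0.0.1
                          ("10.0.0.3", 6347)])
print(len(mesh.sources))  # 3 sources, with only one keyword search
```

The point is that source discovery piggybacks on the download itself, so re-flooding the network with the same query buys you nothing.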
Quote:
In the same period of time, gtk-gnutella found 2789 files, with literally thousands of sources between them. And Kazaa-lite found almost 200 sources for each of the main files of interest, and in some cases nearer to 300 sources! |
KazaaLite and gtk-gnutella repeat searches automatically. They can get away with that because they have a smaller market share on their respective networks. If LimeWire did the same, the performance of the Gnutella network would deteriorate quickly.
All those additional query packets would have to be routed throughout the network, not only causing huge redundancy but also overloading the ultrapeers that have to route the queries.
We had that a while ago and it sucked. Too many queries were overloading the network, so instead of finding more sources you actually found fewer, because queries had to be dropped before they could travel very far.
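A back-of-envelope calculation shows why automatic re-searching scales so badly under Gnutella-style flooding: each query fans out to every neighbour and is forwarded hop by hop until its TTL expires, so even one query produces a large packet count, and every silent repeat multiplies it. The degree and TTL values below are illustrative assumptions, not measured figures.

```python
# Rough upper bound on packets routed for one flooded Gnutella query.
# The originator sends to `degree` neighbours; each subsequent hop
# forwards to degree-1 others, until the TTL runs out. Values here
# (degree=6, ttl=7) are illustrative assumptions only.

def flooded_packets(degree, ttl):
    total = 0
    sent = degree               # packets emitted at the first hop
    for _hop in range(ttl):
        total += sent
        sent *= degree - 1      # each receiver forwards to the rest
    return total

one_search = flooded_packets(degree=6, ttl=7)
print(one_search)       # → 117186 packets for a single query
print(10 * one_search)  # ten automatic repeats: tenfold routing load
```

Multiply that by thousands of clients silently repeating their searches and the ultrapeers have no choice but to drop queries, which is exactly why results got worse rather than better.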