Gnutella Forums

Searching the Smart way (https://www.gnutellaforums.com/general-gnutella-development-discussion/6806-searching-smart-way.html)

Stigeide December 31st, 2001 11:54 AM

In practice, the best way to manage this is to have two hostlists: one that you receive queries from (as today) and one that you send searches to.
This new list should always become better as you drop those that do not respond to your searches and/or to some automatic test searches performed by the client, based on your shared directory.

As for the freeloaders, no one would want them on the list of hosts to send searches to. But that is good for them: less traffic over their modems.
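
A minimal sketch of that bookkeeping in Python (the class, the send_query callback and the test-search probe are illustrative, not taken from any real servent):

Code:

import random

class SmartHostLists:
    """Two-hostlist idea: one list we accept queries from as today,
    one cultivated list we actually send our own searches to."""

    def __init__(self, shared_files):
        self.receive_list = set()        # hosts we receive queries from (unchanged)
        self.send_list = set()           # hosts that have proven they answer searches
        self.shared_files = list(shared_files)

    def on_query_hit(self, host):
        # A host answered one of our searches: keep it on the send list.
        self.send_list.add(host)

    def test_search(self, host, send_query):
        # Probe a host with a keyword taken from our own shared directory;
        # drop it from the send list if it does not answer.
        keyword = random.choice(self.shared_files)
        if not send_query(host, keyword):    # send_query stands in for the transport
            self.send_list.discard(host)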

Moak December 31st, 2001 03:54 PM

Quote:

Originally posted by Stigeide
This is obvious! ;)
Yep, it's obvious that you will segment the network and get cut off from the files you want (you do not get more files, probably you get none). If you destroy members of a chain, the whole chain will be destroyed.
IMHO you are treating the symptoms; perhaps it is better to encourage sharing, as suggested in other threads.

cultiv8r December 31st, 2001 04:32 PM

Keeping to your example (TTL of 4, 3 connected hosts at each node), you gave us this:

Quote:

3**4 + 3**3 + 3**2 + 3**1 = 81 + 27 + 9 + 3 = 120
And that is absolutely correct. You're also correct that if 70% of those 120 were freeloaders, you'd only have a mere 36 nodes that could give you a possible response.

BUT, that is only in the current scenario, where you send a query to each connected node, regardless of whether it is a freeloader or not. Under the scenario you are proposing, where you refrain from sending a query to a node known to be a freeloader, you will end up with a different number.

If 70% of the 3 connected nodes were freeloaders (2.1 ~ 2), then you'd end up with 1 (0.9) non-freeloader per 3 nodes. So:

1**4 + 1**3 + 1**2 + 1**1 = 4.

4 possible nodes, in comparison to 36 possible nodes, is a drastic reduction in my opinion. In both your case and mine, we're also assuming an even spread of freeloaders, which is obviously never the case. What if 3 out of 3 connected nodes turn out to be freeloaders? You'd not be sending out *any* queries to anyone.
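
For reference, the two reach estimates in this post written out as a quick Python sketch (it assumes the same even spread of freeloaders):

Code:

def reach(degree, ttl):
    # Nodes reached when each node forwards to `degree` neighbours per hop:
    # degree + degree^2 + ... + degree^ttl
    return sum(degree ** hop for hop in range(1, ttl + 1))

blind = reach(3, 4)              # 3 + 9 + 27 + 81 = 120 nodes see the query
useful = round(blind * 0.3)      # ~36 of them are non-freeloaders
filtered = reach(1, 4)           # ~1 non-freeloader per node left to forward to: 4 nodes
print(blind, useful, filtered)   # 120 36 4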

However, I can agree that you could have a preference for nodes that seem to return more results on average, although you should never refrain from sending a Query message.

-- Mike


Stigeide January 1st, 2002 07:33 AM

Anyways, thanks for your input.

Stig Eide

Pallando January 1st, 2002 09:48 AM

Every input is welcome! :)

cultiv8r January 1st, 2002 10:53 AM

Just one more thing though. A TTL of 120 will not survive long, as most clients will drop messages after 7~10 hops.
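
A rough sketch of that limit, assuming a client that clamps oversized TTLs to its horizon rather than relaying them (the constant and names are illustrative; real clients differed in whether they clamped or dropped):

Code:

MAX_HORIZON = 7   # hops + remaining TTL a typical client of the era would tolerate

def accept_descriptor(ttl, hops):
    # Return a (possibly clamped) TTL to keep routing with, or None to drop the message.
    if hops >= MAX_HORIZON or ttl == 0:
        return None                        # travelled too far already: drop it
    return min(ttl, MAX_HORIZON - hops)    # clamp oversized TTLs instead of relaying them

print(accept_descriptor(ttl=120, hops=3))  # -> 4: the TTL of 120 never survives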

-- Mike

blb January 1st, 2002 08:24 PM

Also, say you are connected to three other nodes, and they are all freeloaders. Where does the search go now?

hermaf January 2nd, 2002 08:43 AM

My measurements have shown that even if you use a TTL greater than 7, you won't get back packets with a TTL > 7, with very high probability (I'm talking about 99.9%; I haven't calculated that number exactly yet). So your horizon today is "always" 7 hops in the Gnutella network.

So cutting out the freeloaders from your searches will give you fewer hits. Raising the TTL won't help. But as you might know, there are other ways to prioritize a search/hit ... like eDonkey does.


@blb: Searches carry a TTL (time to live), a number that limits how often the search is forwarded in the network. So the freeloaders will forward the search if TTL > 1 ...
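
Roughly, the flooding rule being described here, as a Python sketch (the dictionary fields are illustrative):

Code:

def forward_query(query, neighbours, came_from):
    # Plain Gnutella flooding: any node, freeloader or not, decrements the TTL,
    # bumps the hop count and relays the query to its other neighbours.
    if query["ttl"] <= 1:
        return []                                   # TTL used up: stop here
    relayed = dict(query, ttl=query["ttl"] - 1, hops=query["hops"] + 1)
    return [(n, relayed) for n in neighbours if n != came_from]

# A freeloader with three neighbours still relays to the other two:
print(forward_query({"ttl": 4, "hops": 0, "search": "foo"}, ["A", "B", "C"], came_from="A"))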

Stigeide January 2nd, 2002 01:16 PM

First, I know that a request with TTL=120 will (thank God) not survive - it was just to make a comparison between the old, inefficient method and my Smart method. It is easier to compare the efficiency of two methods if they consume the same amount of bandwidth.

Anyways, you have to admit that the current method of sending searches blindly is inefficient?

My (Smart ;) ) method would use two lists of hosts for each client:
-   One list to send and forward queries to. This list should be cultiv8ted with hosts that respond to your searches. This way you have the great benefit of being close to hosts that host files you want.
-   One list to receive queries from. You should not care who is on this list, but you know that these hosts prefer your files, if they are using the Smart method.

My claim is that this method will make the searches much more efficient, because:
-   Searches are only sent to those who actually have files.
-   The probability that the hosts that see your search will return a hit is higher, because they are closer to you in "taste".

You can think of it as insiders and outsiders. The outsiders are freeloaders and send their requests to the insiders. The insiders send their requests to other insiders.
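
A sketch of how that routing could look with the two lists described above (all names are illustrative, not from any actual client):

Code:

def relay_incoming_query(query, sender, receive_list, send_list):
    # Accept queries from anyone on the ordinary list (outsiders may search too)...
    if sender not in receive_list or query["ttl"] <= 1:
        return []
    relayed = dict(query, ttl=query["ttl"] - 1, hops=query["hops"] + 1)
    # ...but relay them only to the cultivated list of hosts known to answer
    # searches (the insiders), never back to the sender.
    return [(host, relayed) for host in send_list if host != sender]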
A cute picture:
http://www.geocities.com/stigeide/s.html

Peace!
Stig Eide

Moak January 2nd, 2002 01:50 PM

Quote:

Originally posted by Stigeide
http://www.geocities.com/stigeide/s.html
Looks like the superpeer concept :)

