General Gnutella Development Discussion — For general discussion about Gnutella development.
Re: More about a Gnutella Swarming idea Quote:
Such passive sniffing (or call it passive searching) costs no extra traffic and should already be implemented in a modern client (to improve automatic requeries in a healthy way). Yes, the results will be falsified by query caches... but passive search will help, especially when users do not share enough files to build up good upload statistics.
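The "passive searching" idea above can be sketched in a few lines: a client tallies the search strings in Query messages it already routes, then uses the observed popularity to inform its own requeries. This is only an illustration; the class and method names are made up, not from any real Gnutella client.

```python
from collections import Counter

class PassiveQueryObserver:
    """Tallies search strings seen in routed Query messages.
    A real client would call on_query() from its message-routing path,
    so observing costs no extra network traffic."""

    def __init__(self):
        self.counts = Counter()

    def on_query(self, search_string: str) -> None:
        # Normalize case and whitespace so "Foo  Bar" and "foo bar"
        # count as the same query.
        self.counts[" ".join(search_string.lower().split())] += 1

    def popular(self, n: int = 10):
        # The most frequently observed queries; a client could use these
        # to schedule automatic requeries instead of blind polling.
        return self.counts.most_common(n)

obs = PassiveQueryObserver()
for q in ["foo song", "Foo Song", "bar video"]:
    obs.on_query(q)
```

As the post notes, query caches at intermediate nodes would skew these counts, so the tallies are only a rough popularity signal.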
NOOOOOOO!!! Moak, I am learning how to be a developer. I don't always have a lot to contribute to discussions since I am such a newbie... I learn a lot by reading your ideas, opinions, and questions. I will be sad to see you go.
Thx thx Aigamisou. Meanwhile we have established a new Developers Forum (this one)... I hope we can attract even more developers, geeks, geekgirls, network gurus, friendly moderators, and people with ideas, and build a coders' community for Gnutella's future... and I'm posting more again.
One idea about swarming: why do you need to find out which files are popular and download them "in the background"? You would only cause more traffic... The only thing you need is a cache, sized according to connection speed. If an ADSL user has a 2 GB cache file, it should be OK for him/her nowadays... Whatever you download goes into the cache, then gets copied to the download folder. If your downloads exceed the 2 GB limit, the files that were most uploaded from you stay in the cache, and the others are replaced with the newly downloaded ones - so no freeloaders. And about the modem-user problem: they are not the only ones who "steal" bandwidth. A 56/36k modem user can upload at about 4 kB/s and download at about 8 kB/s; with ADSL/cable at 384/64 or 512/128 the gap is bigger, so they (me too) get even more than they give... Multicasting? Wouldn't that be possible?
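The cache-with-eviction rule proposed above (a fixed byte budget where the least-uploaded files are evicted first, so frequently-uploaded files stick around) could be sketched like this. All class and field names here are illustrative assumptions, not taken from any real client:

```python
# Sketch of the proposed download cache: a fixed byte budget; when adding
# a new file would exceed it, evict the files uploaded least often first,
# so popular files (the ones others download from you) stay cached.

class SwarmCache:
    def __init__(self, limit_bytes: int):
        self.limit = limit_bytes
        self.files = {}  # name -> (size_bytes, upload_count)

    def record_upload(self, name: str) -> None:
        if name in self.files:
            size, ups = self.files[name]
            self.files[name] = (size, ups + 1)

    def add(self, name: str, size: int) -> None:
        # Evict the least-uploaded files until the new file fits.
        while self.used() + size > self.limit and self.files:
            victim = min(self.files, key=lambda n: self.files[n][1])
            del self.files[victim]
        self.files[name] = (size, 0)

    def used(self) -> int:
        return sum(size for size, _ in self.files.values())

# Small-number example (a real cache would be, say, 2 GB):
cache = SwarmCache(limit_bytes=100)
cache.add("a.mp3", 60)
cache.add("b.mp3", 30)
cache.record_upload("a.mp3")
cache.add("c.mp3", 40)  # evicts b.mp3 (never uploaded), keeps a.mp3
```

The eviction key is the upload count, which is what makes this different from plain LRU: a file nobody ever downloads from you is the first to go, which is the "no freeloaders" property the post is after.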
A note for those not aware: Freenet shares through a cache that contains files requested on the network. The cache is encrypted, so you do not know what is being shared. <http://freenetproject.org/>
I agree with 'Nobody' - I think it sux to use bandwidth downloading files that I personally won't ever use. But I think it's a great idea to have a cache of files I have previously downloaded which are deemed 'useful' by the gnutella network, however it decides that.

I have a question, though: if I search for 'brinty spears', what is going to stop me getting individual results for every sub-part of that one giant popular brinty spears mpeg, when all I want is her latest mp3? (OK, bad example, but you get what I mean?) Basically it does come down to a protocol change, I think, even though a very small one. And what about a user of an old version searching? If you just use the current protocol and put the filtering at the client level, old clients will be swamped with search results for partial files. You would be creating, say, 100 files in place of every 1 file now, and then distributing them say 100 times as much - so suddenly 10,000 query hits where before there would have been one. I don't want that!

Also, the point about caching files with offensive content is very valid. This is why Freenet works the way it does: no-one can hope to create a filter that will catch all files they find offensive, and many people find it morally objectionable to knowingly host files they find objectionable. So Freenet encrypts everything so that you can't know what you are hosting, allowing you to say: OK, either I play or I don't, but at least I will never know what horrors I am distributing - therefore the culpability is on the people sharing those files. This moral benefit is not a clear benefit to all users, but the legal benefit certainly is. If you can't know what is being stored, in most countries that currently means you aren't responsible for it; and secondly, no-one can point the finger of blame, because they can't decode the contents unless they already know what the file is (in which case they could well be the people who caused it to be stored on your system - it's a clever protocol!).

Unless you can answer these questions, 'real swarming' should be very optional, and not enabled by default. In that case, how many people will enable it? Perhaps you might want to set up clients so that people can only download swarmed files if they also host them, to encourage people to turn it on. I dunno.
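One way to address the "10,000 query hits" worry above is to collapse hits for chunks of the same file into one logical result on the receiving side, keyed by a hash of the full file. This is just a sketch of that grouping step; the field names (`full_hash`, `host`, `chunk_range`) are assumptions for illustration, not part of the actual Gnutella query-hit format:

```python
from collections import defaultdict

def collapse_hits(hits):
    """Group query hits for partial files by the full file's hash.

    hits: iterable of dicts with keys 'full_hash', 'host', 'chunk_range'.
    Returns one entry per full file, listing every source host and the
    chunk ranges they advertise, so the user sees one result per file
    instead of one result per chunk per host.
    """
    grouped = defaultdict(lambda: {"hosts": set(), "chunks": []})
    for h in hits:
        g = grouped[h["full_hash"]]
        g["hosts"].add(h["host"])
        g["chunks"].append(h["chunk_range"])
    return grouped

hits = [
    {"full_hash": "h1", "host": "1.2.3.4", "chunk_range": (0, 99)},
    {"full_hash": "h1", "host": "5.6.7.8", "chunk_range": (100, 199)},
    {"full_hash": "h2", "host": "1.2.3.4", "chunk_range": (0, 49)},
]
results = collapse_hits(hits)
```

Note this only helps clients that do the grouping; as the post says, old clients that display raw hits would still be swamped, which is why it really amounts to a (small) protocol change rather than a purely client-side fix.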