Improve Download Host/Speed: Download Cache

As the Gnutella network grows, the number of clients available to download files from decreases, so there needs to be a built-in system to compensate for this (besides the fact that everyone should share their files). The solution I see is an upload cache on each computer: a certain amount of hard drive space allocated to store the most-requested files on the network. The Gnutella client monitors the push requests passing through the computer to figure out which files are the most requested. When the user goes idle and the computer's processing power and bandwidth are not being taxed, the client can search for the most-requested file, reach out, and download it. Once the file has been downloaded into the cache, it can be shared.

If this system were implemented, there would have to be safety algorithms to protect the client and the user's bandwidth. The first problem to address is every client on the network downloading the single most-requested file while ignoring the second, third, fourth, and fifth most-requested files, and so on. This could be solved as follows: when the client sends out a search request to find hosts to download from, it counts the number of clients that already hold the file it is searching for. If that number is deemed large, the file is skipped and the client moves on to the next one. The second safeguard is that automatic cache downloads should request the file directly instead of through a push request. This would keep other clients on the network from mistaking an automatic cache download for an actual user downloading the file (and inflating its popularity count).

The other value that must be calculated is the maximum size of a single cache download, which should be controlled by the size of the connection; this would protect the modem user from downloading a 100-megabyte MPEG. The maximum file size to download is the available bandwidth, minus the bandwidth the Gnutella network uses to keep the client functioning properly on the network, multiplied by approximately 10 minutes. That way, if the user returns while the file is downloading or uploading, it would not monopolize the user's bandwidth for an extended period but would always be within 10 minutes of being done. (10 minutes is an arbitrary time; it may need to be increased or decreased to find optimum performance.) This holds unless a low-speed connection is downloading from a high-speed connection; in that case the transfer can take longer than 10 minutes, but it stays unobtrusive because the high-speed user still has bandwidth available. Most clients already have a maximum upload speed built in (this limit should be suspended while the computer is idle).

The next thing to implement is cache filters. Many people would be offended if their computer hosted adult material for other people, so an adult filter would have to be on because of this. The user could also state that he does not want to store programs or music on his computer. Maybe a keyword filter could be added for more flexibility.
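For illustration, here is a minimal sketch of the two safeguards described above: counting push requests to rank files, and capping the size of a single cache download by line speed. All class and method names, and the overhead figure in the example, are my own assumptions, not from any real client:

Code:
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch only; nothing here is from a real Gnutella client.
public class CachePolicy {
    // Arbitrary completion window from the post: a cache transfer should
    // always be within ~10 minutes of finishing at the user's line speed.
    private static final int WINDOW_SECONDS = 10 * 60;

    private final Map<String, Integer> pushCounts = new HashMap<>();

    // Count push requests routed through this node; the most-pushed
    // files become the candidates for the idle-time cache download.
    void onPushRouted(String fileId) {
        pushCounts.merge(fileId, 1, Integer::sum);
    }

    // Maximum size (bytes) of one automatic cache download:
    // (available bandwidth - Gnutella keep-alive overhead) * 10 minutes.
    static long maxCacheFileSize(long lineBytesPerSec, long overheadBytesPerSec) {
        long spare = Math.max(0, lineBytesPerSec - overheadBytesPerSec);
        return spare * WINDOW_SECONDS;
    }
}

With a 56k modem (about 7,000 bytes/s) and, say, 2,000 bytes/s of protocol overhead, the cap works out to 5,000 * 600 = roughly 3 MB, so the 100-megabyte MPEG from the example above would indeed be skipped.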
Swarming

Yeah, swarming is a very good idea! It could increase the download speed for all of us and also put an end to freeloading. Just in case you want to see an old posting (see bottom of first page): http://www.gnutellaforums.com/showth...light=swarming
Swarming is a good idea, but correct me if I am wrong: it cannot work on the current network protocols. If it can, how would it be implemented? Specifically, I would like to know how a client would relate thousands of small pieces and piece them together into one larger file. Also, what would happen if users who hold crucial pieces of the file are offline? Would the download fail, or would it have to wait, maybe for an extended amount of time, until a client that has the piece logs in and is not busy? Swarming sounds good for smaller files but seems to put a practical limit on the size of file this could work with. I am really interested in this; could you please elaborate some more on your idea?
More about a Gnutella Swarming idea

Hmm yes, the current Gnutella protocol would basically be okay for swarming. Surprising, isn't it? Some smaller changes, I guess, but all compatible with older clients; most of the ideas are already discussed somewhere or proposed by other users. A new client must provide extra logic to maintain a pool of swarming parts.

Finding the most requested files is the first item: every client could maintain statistics on its highly uploaded files (files often downloaded by other users from itself) and tell other clients which they are (e.g. once on connect). I think we should not use search queries to maintain these statistics, because they are too inaccurate, and upcoming query caches or super peers (which I both highly recommend) would falsify those results. Every client can then calculate which files are highly requested (within the current horizon) and try to download a random part from a random client, then add this part to a swarming pool. This pool could be refreshed from time to time and should not grow over a specific size (e.g. some MBs on the hard disk).

Finding matching partials (I call the small parts of a file "partials") could be easily solved: just run a normal search. Okay, we should add an improvement here: as we know from multiple-source resuming servants (e.g. Xolox), there is a problem with non-matching partials. It happens that downloaded partials do NOT match each other, and a lot of bandwidth is wasted downloading partials which are not from the same file (Xolox and that 80%-90%-99%-50% problem). To avoid this, I would highly recommend adding hashes to any Gnutella traffic that is file-related (search and download). A 'hash' is a unique identifier (or call it a kind of checksum) for a file within a typical horizon. Indexing files and exchanging hashes could be the clue to improving "automatic researches", which is indeed a must-have for parallel or multi-hosted downloads. Why? Once you have downloaded 25% of a file called "Shritney Pears.doc" and the host disappears, you need to download the rest somewhere else. Automatic researches for "Pears" can help, but only with a unique hash can you make sure that results match... before even downloading them.

Downloading partials is an already-solved item: right now only Xolox provides parallel downloads from multiple peers (fast!), and as another example all FastTrack clients do (Morpheus/Kazaa/Grokster)... but wait a while, more will come for sure! Parallel or segmented downloading of one file is a must-have for swarming; no protocol change is needed at all.

As an advantage of swarming, I especially see making low-bandwidth users (modem users) a valuable resource! No more freeloading, and higher bandwidth for all.

Hope it helps, Moak

PS: Another cool feature to improve downloads could be "specialized gnutella horizons"... if you're interested read this: http://www.gnutellaforums.com/showth...p?postid=13760

Last edited by Moak; November 9th, 2001 at 10:17 PM.
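To make the hash idea concrete, here is a minimal sketch of how a client might tag partials so they are only ever joined when they come from the same file. The class layout and names are my own assumptions; only the general technique (hash the whole file, identify each partial by that hash plus a byte range) follows the post:

Code:
import java.io.FileInputStream;
import java.io.IOException;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;

// Hypothetical sketch: a partial is identified by the hash of the WHOLE
// file plus a byte range, so chunks from different files never get mixed
// (the Xolox 80%-90%-99%-50% problem described above).
public class Partial {
    final String fileSha1; // hash of the complete file, not of the chunk
    final long offset;     // first byte of this partial within the file
    final long length;     // number of bytes in this partial

    Partial(String fileSha1, long offset, long length) {
        this.fileSha1 = fileSha1;
        this.offset = offset;
        this.length = length;
    }

    // Two partials may only be stitched together if the source file matches.
    boolean sameFileAs(Partial other) {
        return fileSha1.equals(other.fileSha1);
    }

    // SHA-1 of a complete local file, as a hex string.
    static String sha1Of(String path) throws IOException, NoSuchAlgorithmException {
        MessageDigest md = MessageDigest.getInstance("SHA-1");
        try (FileInputStream in = new FileInputStream(path)) {
            byte[] buf = new byte[8192];
            for (int n; (n = in.read(buf)) > 0; ) md.update(buf, 0, n);
        }
        StringBuilder hex = new StringBuilder();
        for (byte b : md.digest()) hex.append(String.format("%02x", b));
        return hex.toString();
    }
}

An automatic re-search would then query by name but accept a result only when the advertised hash equals fileSha1, before downloading a single byte.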
Re: More about a Gnutella Swarming idea

I think the worst thing about the network is the traffic needed just to stay connected (pings/pongs) and the traffic that all the queries generate; this is where all efforts should be made. A modem user will use most of his/her bandwidth just to stay connected, leaving almost no bandwidth for downloads/uploads. A new connection scheme may be required to fix this, but then it is difficult to keep compatibility with existing clients. With lower bandwidth requirements for staying connected, you can download/upload faster, and only then can you start thinking about swarming downloads or anything else.
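As a rough plausibility check of that claim, here is a back-of-envelope calculation. Every number in it is an assumption chosen for illustration, not a measurement:

Code:
// Back-of-envelope sketch with ILLUSTRATIVE numbers (not measurements):
// estimate how much of a 56k modem the routed overhead alone could eat.
public class OverheadEstimate {
    public static void main(String[] args) {
        int connections = 4;           // open peer connections (assumed)
        double msgsPerSecPerConn = 10; // routed queries + pings (assumed)
        int avgMsgBytes = 80;          // header + small payload (assumed)

        double overhead = connections * msgsPerSecPerConn * avgMsgBytes; // bytes/s
        double modem = 56000 / 8.0;    // ~7000 bytes/s line speed

        System.out.printf("overhead: %.0f B/s = %.0f%% of a 56k modem%n",
                overhead, 100 * overhead / modem);
        // With these assumptions: 3200 B/s, i.e. nearly half the line,
        // which is why overhead must shrink before swarming pays off.
    }
}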
However, when I connect to Gnutella from home with a modem, most of the bandwidth is used even when I'm not downloading/uploading, making it almost impossible to use Gnutella over a modem; I'm sharing files, but I don't think anyone will be able to download much at 2 Kb/s or less... With ADSL it's different, though.
Yeah, I totally agree, swarming is far, far future. Before swarming, other concepts for reducing traffic are more important IMHO: query caches, super peers (!) and a substitute for the ineffective pings/pongs. This stuff will hopefully reduce backbone traffic dramatically, so that e.g. modem users can download at full speed.

Another interesting approach to reduce traffic has not been mentioned here so far: host caches or super peers with a regional topology and multicast, to reduce ISP traffic. While the Gnutella backbone causes huge traffic for ISPs, traffic should be reduced not only inside the Gnutella network but also on the underlying physical network. Not reducing this huge ISP traffic means slower network performance and increasing costs for ISPs, which for the end user (us) could mean increased internet costs or more expensive flat rates.

About your question on caching/swarming the most requested files, I don't know how to describe it better... hmm. Swarming means solving one basic problem of Gnutella: busy slots and only a few people sharing. The problem could be solved by spreading often-requested files (NOT seldom-queried files) over the network, so downloads happen more often and much faster; see the sketch below. If you have any further questions, just ask here again or meet me on IRC. It would be interesting if the LimeWire developers would describe their plans for swarming. Anyone know details?

Greets, Moak

PS: Super peers are mentioned in this thread

Last edited by Moak; November 14th, 2001 at 08:54 AM.
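A minimal sketch of that spreading rule, combined with the earlier idea of counting how many hosts already hold the file; both thresholds are invented for illustration:

Code:
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch of the "spread popular files only" rule from the post.
// Thresholds and names are assumptions for illustration.
public class SwarmCandidatePicker {
    private final Map<String, Integer> uploadCounts = new HashMap<>();

    // Called whenever another user finishes downloading one of our files.
    void recordUpload(String fileHash) {
        uploadCounts.merge(fileHash, 1, Integer::sum);
    }

    // A file is worth replicating into the swarming pool only if it is
    // requested often (popular) AND not already held by many hosts in
    // the current horizon (hostsSeen comes from a normal search).
    boolean shouldReplicate(String fileHash, int hostsSeen) {
        int uploads = uploadCounts.getOrDefault(fileHash, 0);
        return uploads >= 5      // "often requested", assumed threshold
            && hostsSeen < 20;   // already well spread -> skip, assumed
    }
}

This way seldom-queried files are never replicated, and popular files stop spreading once enough copies exist within the horizon.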