Gnutella Forums  

#1
Old June 14th, 2001
Abaris
Ringwraith
 
Join Date: May 13th, 2001
Location: Europe
Posts: 86
Combination of existing features would solve most Gnutella downloading problems

I had a dream in which I searched the Gnutella Network for a file with LimeWire and got a number of groups as results. Having selected a group containing lots of files and a good number of four-star downloads, the tab changed and I was on the Phex download window. All the elements of the Smart Download group, since they all had the same file size, were automatically added as Download Candidates. Then I woke up, and I can now see the great step forward for both LimeWire and Phex that my dream revealed:

LimeWire's Smart Downloads behave poorly because LimeWire is unable to search for *new* sources to resume a download once it has started. Most downloads fail because the uploading user has disconnected from the Gnutella Network. Since most users are not permanently connected to the Internet and most ISPs assign dynamic IPs, a user's IP address is likely to have changed by the time he rejoins the Gnutella Network, so a partial download will almost never be resumed by trying to reconnect to old download sources only.
Phex offers a much better feature: it maintains a dynamic 'Download Candidate' list for each download. When the list entries are out of date, the Gnutella Net is automatically re-searched for the file using the original search term. Every result whose file size exactly matches is added to the list of Download Candidates. The search term can even be edited by the user, and Candidates can be manually added to and removed from the list. Phex also searches for new Candidates silently in the background. This way you get fresh, usable IPs instead of unusable out-of-date ones.
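A minimal sketch of that candidate-refresh idea (the dict field names and the exact-size matching rule are my assumptions for illustration, not Phex's actual code):

```python
def refresh_candidates(candidates, search_results, target_size):
    """Add new sources whose file size exactly matches the download.

    candidates / search_results are lists of dicts with at least
    'host', 'filename' and 'size' keys (hypothetical field names).
    Results with a different size are ignored; duplicates are skipped.
    """
    known = {(c["host"], c["filename"]) for c in candidates}
    for result in search_results:
        key = (result["host"], result["filename"])
        if result["size"] == target_size and key not in known:
            candidates.append(result)
            known.add(key)
    return candidates
```

Run silently in the background, as the post describes, this keeps the list stocked with currently reachable IPs.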

However, when you start a search you generally get lots of results with slightly different file sizes. Although three kilobytes more or less are completely meaningless as far as audio or video quality is concerned, the difference makes it impossible for these files to serve as Download Candidates for each other. The problem is that Phex offers no way to determine how many Candidates are likely to be found for each file returned in the search results.
The solution to this problem is LimeWire's Smart Download groups, reconfigured to group files by size rather than by name (grouping by filename is ineffective for Smart Downloading anyway). Suppose you get two groups, one containing 20 elements and one containing 5. If you download both groups (the Phex Candidate way), stop them once you have half of each file, and re-search for new Candidates even months later, you are likely to find about 20 new Candidates for the first file and about 5 for the second, because the average number of available mirrors tends to remain constant regardless of your particular network horizon. So you have a way to estimate the number of available Candidates before you start a download.
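The grouping-by-size idea fits in a few lines (a rough illustration, not LimeWire's actual grouping code); the length of each group is exactly the mirror-count estimate described above:

```python
from collections import defaultdict

def group_by_size(results):
    """Group search results by exact file size.

    Each group holds interchangeable Download Candidates; the group
    length estimates how many mirrors you can expect to find again
    on a later re-search.
    """
    groups = defaultdict(list)
    for r in results:
        groups[r["size"]].append(r)
    # Biggest group first: more mirrors now means better resume odds later.
    return sorted(groups.values(), key=len, reverse=True)
```
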

Combining these two features with the LimeWire four-star rating system would solve all downloading and resuming problems the Gnutella Network currently suffers from. You would know the quality of a download before you start it, you would know how many mirrors there are on the Network (a number likely to remain constant even over longer periods of time), and you would be able to search for new IPs when the old ones go stale (which is a matter of days or even hours). The only remaining problem would be download speed. And even that could be solved quite easily:
Maintaining a dynamic Download Candidate list for each file is not only a great way to resume downloads. It would also make the implementation of 'segmented downloading' very easy. Imagine you have an ISDN Internet connection that allows you to download 8 kilobytes per second. Now imagine you have two Download Candidates: one offers a rate of two kilobytes per second, the other three. Downloading the first half of the file from the first source and the second half from the second source at the same time would give you a combined speed of five kilobytes per second and thus almost double your download speed. Download managers like GetRight use this mechanism, and with enough Download Candidates in your list it is likely to saturate your full connection speed for every download.
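The ISDN example works out as follows (a toy sketch, assuming each source's rate is steady and the only other limit is your own line):

```python
def split_segments(file_size, rates):
    """Assign each source a contiguous byte range proportional to its
    rate, so that all sources finish at roughly the same time."""
    total = sum(rates)
    segments, start = [], 0
    for i, rate in enumerate(rates):
        # Last segment absorbs any rounding remainder.
        end = file_size if i == len(rates) - 1 else start + file_size * rate // total
        segments.append((start, end))
        start = end
    return segments

def combined_speed(rates, line_cap):
    """Parallel disjoint segments add their rates, capped by your link."""
    return min(sum(rates), line_cap)
```

With the post's numbers, `combined_speed([2, 3], 8)` gives 5 KB/s, and a proportional split of a 1000 KB file is `[(0, 400), (400, 1000)]` rather than an even half-and-half split.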

As we all dream of better Gnutella Network performance, I hope these techniques will be combined in a future client, Phex or LimeWire or any other, preferably written in pure Java to also grant the Gnutella community true platform independence. It would make life so much easier for everybody.

Last edited by Abaris; June 14th, 2001 at 07:14 AM.
#2
Old June 14th, 2001
scud
Guest
 
Posts: n/a

That's one hell of a dream, man.
#3
Old June 18th, 2001
Unregistered
Guest
 
Posts: n/a

One day it'll happen, but I'm not holding my breath.
#4
Old June 18th, 2001
Unregistered
Guest
 
Posts: n/a

Please read the feedback forum!!
#5
Old June 19th, 2001
kwantem
Novicius
 
Join Date: June 19th, 2001
Posts: 2
Excellent...

All excellent ideas -- and all should be fairly easy to implement, too. I hope you've sent your comments directly to the LimeWire folks if you haven't already!
#6
Old June 19th, 2001
SRL
Gnutella Veteran
 
Join Date: March 23rd, 2001
Posts: 144

Hm, maybe you were dreaming of Gnucleus! Right now it has the best download resume I've seen so far (be sure to get the newest version). You can re-search for new results, and it groups either by name or by file size (you decide). The UI takes a little getting used to (and I wish it had LW's file-type selection to limit searches), but I managed to download an 840 MB file over a period of a few days from about 20 different servers -- and it actually worked when done!

I doubt *any* of the other G-net clients out there now could have pulled that off.
#7
Old June 20th, 2001
Abaris
Ringwraith
 
Join Date: May 13th, 2001
Location: Europe
Posts: 86

Thanks for your kind replies... the problem is, I have had a second dream (a daydream this time, during a boring English course) and I now doubt that implementing segmented downloading would be a good idea. I would rather have the servents stick to linear downloading even if they could do segmentation. Let me explain:

Downloading on the Gnutella Network is hard. Very often you will find files that you cannot download; your servent just says 'waiting, retrying, pushing, error'. The problem is that many users download far more files than they upload, and most have limited their upload speed to far less than 50%. Upload slots, however, are limited, and therefore there are far too few of them available on the Network (I really wonder why Napster seemed to have no such problems -- could it be due to the larger user base?).

If techniques such as segmented downloading were to spread across the Gnutella Network, they would do great damage rather than bring progress. People would share no more than they do now, but they would consume two or three times as many upload slots. The result: it would become even harder to find available mirrors. As explained in my last post, segmentation only works if there are enough available Candidates. But if many users use segmentation, they dramatically decrease the number of available Candidates, so segmentation only works as long as a small minority uses it. Once it spread across the Network, it would no longer work.

However, everything else I posted should work; LimeWire's and Phex's features really mesh like cogs in a clockwork. I do hope that Lime will implement a dynamic Download Candidate list in the next version.
#8
Old June 20th, 2001
Unregistered
Guest
 
Posts: n/a

Quote:
Originally posted by Abaris

If techniques such as segmented downloading were to spread across the Gnutella Network, they would do great damage rather than bring progress. People would share no more than they do now, but they would consume two or three times as many upload slots. The result: it would become even harder to find available mirrors. As explained in my last post, segmentation only works if there are enough available Candidates. But if many users use segmentation, they dramatically decrease the number of available Candidates, so segmentation only works as long as a small minority uses it. Once it spread across the Network, it would no longer work.
Actually, if you're talking about simultaneous downloads from multiple hosts, I think that's a great idea (and something I've been expecting to happen). It's the next logical step once you have multiple servers grouped for a file.

Also, I don't think it would be any worse for uploads, because while it's true that it would fill more upload slots, those slots would stay occupied for a correspondingly shorter period of time. On the whole, the average number of slots available at any given time shouldn't decrease because of such a feature.
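That slot-time argument checks out with back-of-envelope arithmetic (under the simplifying assumption that every slot serves at the same rate however many are open):

```python
def slot_seconds(file_size_kb, slot_rate_kbps, n_segments):
    """Total slot-seconds spent serving one file split into n equal
    segments, each segment occupying one upload slot at the same rate."""
    time_per_segment = (file_size_kb / n_segments) / slot_rate_kbps
    return n_segments * time_per_segment
```

A 1000 KB file served at 2 KB/s per slot costs 500 slot-seconds whether it goes out as one stream or as four segments; segmentation ties up more slots at once, but each for a quarter of the time.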

The hard part is coming up with a good strategy, since hosts vary so much in speed and often abort before finishing. One idea I had: since there's no real penalty to the server if a client closes the connection prematurely (the server just closes too and gets on with life), you could try something like this...

* First connection requests the entire file.
* Second connection requests from 50% to the end of the file.
* Third requests from 25% to the end.
* Fourth requests from 75% to the end.
* And so on...

The key point is that every request runs from some starting point to the end of the file (rather than just up to the next starting point). This way each connection can continue on if another one drops.

Another feature of this: if a download thread hits the point where another one started, it can do one of two things: a) just stop there, or b) continue on if it's much faster than the other thread and would probably beat it to the end (the overtaken connection would be dropped).
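The offset pattern described above (0%, then 50%, then 25% and 75%, and so on) is a breadth-first binary subdivision; here is a small sketch of how a client might generate it (my own illustration, not taken from any actual servent):

```python
def request_offsets(n_connections):
    """Starting offsets, as fractions of the file size, for each
    successive connection: 0, 1/2, 1/4, 3/4, 1/8, 3/8, 5/8, 7/8, ...
    Every request runs from its offset to the end of the file."""
    offsets = [0.0]
    denominator = 2
    while len(offsets) < n_connections:
        # Add the odd multiples of 1/denominator: midpoints of the
        # gaps between all previously issued starting points.
        for numerator in range(1, denominator, 2):
            offsets.append(numerator / denominator)
            if len(offsets) == n_connections:
                break
        denominator *= 2
    return offsets
```

`request_offsets(4)` yields `[0.0, 0.5, 0.25, 0.75]`, matching the four connections in the list above; each new connection starts in the middle of the largest remaining gap.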

(note this is reposted from a different thread of mine in the Gnotella forum)





Copyright © 2020 Gnutella Forums.
All Rights Reserved.