New Feature Requests: Your idea for a cool new feature, or a LimeWire annoyance that has to get changed.
Hm, maybe you were dreaming of Gnucleus! Right now it has the best download resume I've seen so far (be sure to get the newest version). You can re-search for new results, and it groups either by name or by file size (you decide). The UI takes a little getting used to (and I wish it had LW's file type selection to limit searches), but I managed to download an 840 meg file over a period of a few days from about 20 different servers - and it actually worked when done! I doubt *any* of the other G-net clients out there now could have pulled that off.
Thanks for your kind replies... the problem is, I have had a second dream (a daydream this time, during a boring English course), and I now doubt that implementing segmented downloading would be a good idea. I would rather have the servents stick to linear downloading even if they could do segmentation. Let me explain.

Downloading on the Gnutella Network is hard. Very often you will find files that you cannot download; your servent only says 'waiting, retrying, pushing, error'. The problem is that many users download far more files than they upload, and most of them have limited their upload speed to far less than 50%. The upload slots, however, are limited, and therefore there are far too few upload slots available on the Network (I really wonder why Napster seemed to have no such problems - could it be due to the larger user base?).

If techniques such as segmented downloading were to spread over the Gnutella Network, they would do more damage than good. People would share no more than they do now, but they would consume two or three times as many upload slots. The result: it would become even harder to find available mirrors. As explained in my last post, segmentation only works if there are enough available Candidates. However, if many users use segmentation, they will dramatically decrease the number of available Candidates; segmentation therefore only works as long as it is used by a small minority. Spread across the Network, it wouldn't work any more.

However, everything else I posted should work; LimeWire's and Phex's features really match like cogs in a clockwork. I do hope that Lime will implement a dynamic Download Candidate List in the next version.
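To make the slot arithmetic behind this concern concrete, here is a minimal back-of-envelope sketch. The numbers (total upload slots, concurrent downloads) are invented for illustration only; the point is simply that if every downloader opens several segments, each download ties up that many upload slots at once.

```java
// Hypothetical figures for illustration only - not measurements of the real network.
public class SlotDemand {
    public static void main(String[] args) {
        int uploadSlots = 1000;        // assumed total upload slots offered network-wide
        int activeDownloads = 900;     // assumed downloads in progress at the same time

        for (int segmentsPerDownload : new int[] {1, 2, 3}) {
            int slotsRequested = activeDownloads * segmentsPerDownload;
            int freeSlots = Math.max(0, uploadSlots - slotsRequested);
            System.out.printf("%d segment(s) per download -> %d slots requested, %d slots left free%n",
                    segmentsPerDownload, slotsRequested, freeSlots);
        }
    }
}
```

With one connection per download there are still a few free slots; at two or three segments per download the demand simply outruns the supply, which is the "harder to find available mirrors" effect described above.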
Also, I don't think it would be any worse for uploads, because while it's true that it would fill more upload slots, those slots would stay occupied for a correspondingly shorter period of time. On the whole, the average number of slots available at any given time shouldn't decrease from such a feature.

The hard part is coming up with a good strategy, since hosts vary in speed so much and often abort before finishing. One idea I had: since there's no real penalty to the server if a client closes the connection prematurely (the server just closes also and gets on with life), you could try something like this...

* First connection requests the entire file.
* Second connection requests from 50% to the end of the file.
* Third requests from 25% to the end of the file.
* Fourth requests from 75% to the end of the file.
* And so on...

The key point is that all requests are made from some starting point always to the end of the file (rather than up to the next starting point). This way they can continue on if the other connection drops. Another feature of this: if a download thread hits the point where another one starts, it can do one of two things - a) just stop there, or b) continue on if it's much faster than the other thread and would probably beat it to the end (the overtaken connection would be dropped).

(note this is reposted from a different thread of mine in the Gnotella forum)
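Here is a minimal sketch of that scheduling idea, assuming Gnutella's usual HTTP-style transfers: each connection gets a start offset in the order described (whole file, then 50%, 25%, 75%, and so on) and requests an open-ended range from that offset to the end of the file. The class and method names are made up for illustration and are not taken from LimeWire, Phex, or Gnotella.

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of the "every request runs to the end of the file" scheme described above.
public class SegmentPlanner {

    // Fractional start offsets in the order described: 0, 1/2, 1/4, 3/4, 1/8, 3/8, ...
    static List<Double> startFractions(int connections) {
        List<Double> starts = new ArrayList<>();
        starts.add(0.0);                       // first connection: the whole file
        int denominator = 2;
        while (starts.size() < connections) {
            for (int numerator = 1; numerator < denominator && starts.size() < connections; numerator += 2) {
                starts.add((double) numerator / denominator);
            }
            denominator *= 2;                  // next level of halving
        }
        return starts;
    }

    // Build an open-ended HTTP Range header ("bytes=START-") for each connection.
    static List<String> rangeHeaders(long fileSize, int connections) {
        List<String> headers = new ArrayList<>();
        for (double fraction : startFractions(connections)) {
            long startByte = (long) (fraction * fileSize);
            headers.add("Range: bytes=" + startByte + "-");   // from startByte to end of file
        }
        return headers;
    }

    public static void main(String[] args) {
        // Example: an 840 MB file spread across four connections.
        long fileSize = 840L * 1024 * 1024;
        rangeHeaders(fileSize, 4).forEach(System.out::println);
    }
}
```

Because every request runs to the end of the file, any surviving connection can keep going past the point where another one started, which is exactly what makes the stop-or-overtake choice above possible.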