General Gnutella / Gnutella Network Discussion — for general discussion about Gnutella and the Gnutella network.
A feature that would be interesting is being allowed to download the same file from multiple places at the same time, as one file. It could increase the speed at which a file downloads, or the chance that the file completes in one swoop in case one person disconnects while you're downloading. LimeWire (and perhaps other clients I haven't tried) already groups identical files together in search results; the same method could be used to download from all of them at once. I think this feature would be great for large files. Maybe it's already implemented and I'm just unaware. I don't know if it's possible, or what complexity or issues might arise from doing such a thing, but I personally think it would come in handy.
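The grouping the post mentions can be sketched in a few lines. This is a hypothetical illustration, not any client's actual code: it assumes search results carry a content hash (Gnutella clients such as LimeWire key identical files on a hash like SHA-1) and groups hits by that hash plus size, so one download could draw on every host offering the same file.

```python
from collections import defaultdict

def group_results(results):
    """Group search hits that refer to identical file content.

    `results` is a list of (filename, size, content_hash) tuples.
    Hits with the same hash and size are the same file, even if the
    filenames differ, so they can be treated as one download candidate.
    """
    groups = defaultdict(list)
    for name, size, content_hash in results:
        groups[(content_hash, size)].append(name)
    return dict(groups)
```

For example, `song.mp3` and `song(1).mp3` with the same hash and size would land in one group, while a different file stays separate.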
You want it all and you want it now. If you sit behind the client all day, then I understand what the problem is. BearShare is notorious for not wanting to add any automated features; people have begged and pleaded for some and it doesn't happen, so get rid of it. Use a more automated client and let your downloads come in overnight; plus, you're helping the network by sharing out all night. Yes, you have to wait until the next day, but if you do it nightly you'll have so much stuff you won't know what to do with it. You should see all the stuff I get at 56k; it's overload in the morning, and I share back lots too. Resuming a download is about the same thing: if one server drops, you just go to the next.

At this point in the network it would be harmful, because it's hard enough to get one connection, let alone two or three, and you would tie up all the available servers for that file (albeit for a shorter period). I bet you would then go on to tie up more servers with another file, so the argument that you would then be done and go away is flawed. I say it's a bad idea. It doesn't work for the web; servers are massively throttling now because of it. Greed is never good. Tailgating on the freeway never gets you anywhere either.

On the other hand, you can do it now: just write your own client and request the first part of a file, then request the same file from another server starting 1/4 of the way in, then another and another, and hope it all comes together. So go get to coding; lots of source code is out there. If you want it right now this minute, that is.
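The do-it-yourself scheme described above (ask each server for a different quarter of the file, then stitch the pieces together) can be sketched like this. It is a minimal illustration, not a real client: `fetch_range` is a stand-in for an HTTP/1.1 request with a `Range: bytes=start-end` header, and here each "source" is just the full file content that peer would serve.

```python
def split_ranges(size, parts):
    """Split a file of `size` bytes into `parts` (start, end) byte
    ranges, end-inclusive, covering the whole file with no overlap."""
    chunk = size // parts
    ranges, start = [], 0
    for i in range(parts):
        # Last part absorbs any remainder from the integer division.
        end = size - 1 if i == parts - 1 else start + chunk - 1
        ranges.append((start, end))
        start = end + 1
    return ranges

def fetch_range(source, start, end):
    # Stand-in for "GET file, Range: bytes=start-end" against one peer;
    # `source` here is simply that peer's copy of the file as bytes.
    return source[start:end + 1]

def swarm_download(sources, size):
    """Request one byte range per source and reassemble the file."""
    ranges = split_ranges(size, len(sources))
    parts = [fetch_range(src, s, e) for src, (s, e) in zip(sources, ranges)]
    return b"".join(parts)
```

A real client would of course also have to handle peers that drop mid-transfer (re-requesting the missing range elsewhere), which is exactly the resume-from-the-next-server behavior already described above.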