New Feature Requests: Your idea for a cool new feature, or a LimeWire annoyance that has to get changed.
You start a download or two, maybe have multiple hosts pumping data in your direction, then you head away from the computer for dinner, or to sleep. You return to find that the original hosts have either gone offline or have stopped responding. LimeWire SHOULD be smart enough to notice this and initiate another search, but instead you are greeted with "Awaiting sources," or you notice that you are downloading at a bitrate of zero, forever. All the user need do is tell the search window for that file to repeat the search, and the download may not only resume, you may find several hosts jumping on the wagon.

Someone at LimeWire seems to have forgotten that computer programs are supposed to have built-in intelligence. How hard would it be to have LimeWire notice that the download has either stalled or lost its host and reinitiate the search for itself? How hard would it be to check, now and then, to see whether another host has appeared that might shorten the download of that file? It would add little traffic, because right now people are constantly clicking the repeat-search button to make up for this failing in the program.

And while I have you on the wire: if I right-click a given file search window's tab to bring up the options menu, it will NOT dismiss unless I click on the same tab a second time. Click on the window to try to dismiss the menu, as is done in most programs, and nothing happens, except that you must now click twice on the tab: once to attract the attention of the program and once to cancel the menu. This is just sloppy programming, and should be fixed.

JayG
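Roughly the kind of watchdog being asked for here, as a sketch only: the class, the `Download` interface, and the `requery()` and `bytesPerSecond()` hooks are made up for illustration and are not LimeWire's actual code.

```java
import java.util.List;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.CopyOnWriteArrayList;

// Hypothetical "stalled download" watchdog: poll each active download once a
// minute, and if it has sat at zero throughput (or lost all its sources) for
// several checks in a row, repeat the search for it automatically.
public class StallWatchdog implements Runnable {

    /** Minimal view of a running download that the watchdog can inspect. */
    public interface Download {
        long bytesPerSecond();   // current transfer rate
        boolean hasSources();    // false once every known host has vanished
        void requery();          // repeat the original search for this file
    }

    private static final long CHECK_INTERVAL_MS = 60_000;  // poll once a minute
    private static final int  STALLED_CHECKS    = 5;       // ~5 minutes at zero

    private final List<Download> downloads = new CopyOnWriteArrayList<>();
    private final Map<Download, Integer> stalledCount = new ConcurrentHashMap<>();

    public void watch(Download d) {
        downloads.add(d);
    }

    @Override
    public void run() {
        while (!Thread.currentThread().isInterrupted()) {
            for (Download d : downloads) {
                boolean stalled = d.bytesPerSecond() == 0 || !d.hasSources();
                if (!stalled) {
                    stalledCount.put(d, 0);
                } else if (stalledCount.merge(d, 1, Integer::sum) >= STALLED_CHECKS) {
                    d.requery();              // ask the network again, automatically
                    stalledCount.put(d, 0);   // reset so we don't spam requeries
                }
            }
            try {
                Thread.sleep(CHECK_INTERVAL_MS);
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        }
    }
}
```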
So let me see if I have this straight... Having LimeWire do a search when the host it's downloading from is lost would somehow flood the system with searches, yet the users who sit there requesting search after search simply to "keep the ball in play" don't. Well, the solution is simple if we take your comment to its logical conclusion: ban all searches and there will be no traffic at all.

Am I being a smart-***? Certainly. But if the point of downloading is to actually get the file, and you care about doing it quickly... One of the things a forty-year career in computer design has taught me is that saying "We've never done that" is hardly the way to stay at the forefront, product-wise.

Jay
If I remember the many heated discussions last year when the devs had to reluctantly agree to give up automatic searches, they had to be banned entirely because there was no way to tell the one or two legitimate retries from the abusers. Search for posts by the partial poster name trap_jaw with the search term "automatic requeries" for more details about the coding problems and trials. If you can read CVS logs, try these links:

http://core.limewire.org/servlets/Br...ev&paged=false
http://core.limewire.org/servlets/Br...vs&paged=false

It sure would be good to see efficient automatic re-searches. LW is open source, so you can build your own CVS version, but others ban certain levels of retry.
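For what it's worth, a client-side limit on "levels of retry" is simple to sketch; the hard part, as noted above, is that the hosts receiving the query have no way to verify the sender is honouring it. Purely illustrative, with a made-up `tryAcquire()` gate rather than anything from the real codebase:

```java
import java.util.HashMap;
import java.util.Map;

// Illustrative per-file throttle: allow one automatic requery per
// REQUERY_INTERVAL_MS. A well-behaved client checks this before repeating
// a search; an abusive one simply doesn't, which is why throttles like this
// couldn't save automatic requeries on their own.
public class RequeryThrottle {

    private static final long REQUERY_INTERVAL_MS = 45 * 60 * 1000; // 45 minutes

    private final Map<String, Long> lastRequery = new HashMap<>();

    /** Returns true (and records the attempt) only if this file may be requeried now. */
    public synchronized boolean tryAcquire(String fileId) {
        long now = System.currentTimeMillis();
        Long previous = lastRequery.get(fileId);
        if (previous != null && now - previous < REQUERY_INTERVAL_MS) {
            return false; // too soon since the last automatic retry
        }
        lastRequery.put(fileId, now);
        return true;
    }
}
```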
I don't know the first thing about network traffic, but what if LW could somehow sense a lull in volume and send its queries then? Or is there another way to query that's fundamentally different from what's currently in use?
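The "lull" idea could look something like the sketch below: keep a short sliding window of how much traffic the node has handled recently, and only let a background requery go out when that count is low. The class and threshold are invented for the example and don't correspond to anything in LimeWire.

```java
import java.util.ArrayDeque;
import java.util.Deque;

// Hypothetical lull detector: count the messages this node has sent or
// routed in the last minute, and report "quiet" when the count falls below
// a threshold, so background requeries ride out on idle periods.
public class LullDetector {

    private static final long WINDOW_MS       = 60_000; // look at the last minute
    private static final int  QUIET_THRESHOLD = 200;    // "quiet" if fewer messages than this

    private final Deque<Long> messageTimestamps = new ArrayDeque<>();

    /** Call once for every message this node sends or routes. */
    public synchronized void recordMessage() {
        messageTimestamps.addLast(System.currentTimeMillis());
        prune();
    }

    /** True when recent traffic is low enough to slip in a background requery. */
    public synchronized boolean isQuiet() {
        prune();
        return messageTimestamps.size() < QUIET_THRESHOLD;
    }

    private void prune() {
        long cutoff = System.currentTimeMillis() - WINDOW_MS;
        while (!messageTimestamps.isEmpty() && messageTimestamps.peekFirst() < cutoff) {
            messageTimestamps.removeFirst();
        }
    }
}
```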
I think that even if LW were to do an automatic re-search on failed d/l's every hour or so, that would be sufficient. There have been many times I've gone to sleep or to work, come home, checked the status of my queues and found that they haven't even started (which is a real kick in the *ahem* when I only queue search results that yield 5+ results). Re-searching once every hour would not load up the network a whole lot, would it?
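Scheduling that hourly retry is the easy part; a sketch of it is below, where `requeryStalledDownloads` is a stand-in for whatever would actually walk the download queue and repeat the searches, not a real LimeWire method.

```java
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

// Sketch of the "once an hour" idea: one scheduled task that re-runs the
// search for every download still stuck at "Awaiting sources".
public class HourlyRequeryTask {

    private final ScheduledExecutorService scheduler =
            Executors.newSingleThreadScheduledExecutor();

    /** Start re-searching stalled downloads once an hour, beginning an hour from now. */
    public void start(Runnable requeryStalledDownloads) {
        scheduler.scheduleAtFixedRate(requeryStalledDownloads, 1, 1, TimeUnit.HOURS);
    }

    public void stop() {
        scheduler.shutdownNow();
    }
}
```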