October 19th, 2003
rockkeys
Join Date: September 30th, 2003
Posts: 27
I agree, this is a very harmful change...

At least it is when downloading a large file from only one or two sources. I think the developers forget that a file that reports 200 sources really means that maybe three of them can actually be connected to. The rest are busy, long gone (though their data persists in the index), or have queues so large that it would take days to get an active slot.

Too many times I have seen messages like (queued: slot 97 of 100). If I leave a query running during the download, I may eventually turn up 200 or 300 sources, but Gnutella says that only 22 of them are alive, all of them busy, to be retried later (and the retry gets another busy, and so on).
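To make the arithmetic concrete, here is a toy sketch of how a big source count collapses to a handful of usable hosts. This is my own illustration, not any real client's data model; the `Source` fields and the cutoff of 5 queue positions are assumptions:

```python
# Toy model of why "200 sources" means maybe 3 usable ones.
# The Source fields (alive, busy, queue_pos, queue_len) are
# illustrative only, not any actual client's internals.
from dataclasses import dataclass

@dataclass
class Source:
    alive: bool       # host still responds at all
    busy: bool        # all of its upload slots are taken right now
    queue_pos: int    # our position in its upload queue (0 = active slot)
    queue_len: int    # total length of its upload queue

def usable(sources, max_wait_pos=5):
    """Sources we could realistically download from soon:
    alive, not busy, and near the front of the queue."""
    return [s for s in sources
            if s.alive and not s.busy and s.queue_pos <= max_wait_pos]

# 200 reported sources, but most are gone, busy, or buried in a queue
# (e.g. "queued: slot 97 of 100").
sources = (
    [Source(False, False, 0, 0)] * 178      # long gone; stale index data
    + [Source(True, True, 0, 0)] * 15       # alive but busy, retry later
    + [Source(True, False, 97, 100)] * 4    # queues so deep it takes days
    + [Source(True, False, 2, 10)] * 3      # actually reachable soon
)
print(len(sources), "reported ->", len(usable(sources)), "usable")
```

Run it and the 200 reported sources shrink to 3 usable ones, which matches what the download window actually shows.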

The congestion on the Gnet is abysmal, and is getting rapidly worse. It looks like way too many people are downloading, but not sharing what they download, or what they have of their own content.

Yet as far as I know, not a single software development group has made any effort to publish guidelines on how many upload versus download slots should be active, or to explain the real consequences of ignoring those guidelines. In fact, it's pretty clear that the people who write the software have not tried to use it the way the typical user does, or they'd realize that for the most part, Gnet doesn't work for large files. It probably works OK for the popular MP3 junk out there, but for multi-hundred-MB files, too many things prevent the user from ever completing a good download.

I'm not pointing at any specific client, as it's the network and its users that are causing the problems. Too many must be greedy with bandwidth, and just won't share.

So along comes joe_user, and he wants the entire contents of some current music disc, so he searches for the files, selects them all for download, then goes back to browsing. He notices that throughput is terrible with all this sharing going on, so he turns off his uploads.

Multiply this by about 50% of the users, and you can see why people are irritated. Yet there has been a distinct lack of noise from the powers that be in Gnutella development about this, or about what to do to fix it, beyond quiet 'don't forget to share' reminders in their posts.

We really need every single Gnet client group to post a strong recommendation on upload/download ratios, and an explanation of what will happen if those guidelines are ignored. The average user has no idea what effect his settings will have on the network, and generally doesn't care, or even think about it much, because no one ever bothered to tell him that it matters, and matters a lot.

With our current situation, and the clients that disconnect after 97K transfers, we can't get files anymore. I've gone back to IRC to get my anime files, because I can download an entire 200MB video clip in about 45 minutes over a direct connection to a fast server. I have never (hear me, guys? NEVER) been able to download a complete anime video on Gnet. FastTrack works, but is also getting much slower, just like Gnet. However, I think Gnet has more users, and is suffering more from the situation because of it.
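The rough arithmetic behind that IRC comparison: 200 MB in 45 minutes works out to about 75 KB/s sustained, a rate one direct connection to a fast server can hold, but a handful of busy, queued Gnutella sources usually cannot. A quick check (using 1 MB = 1024 KB):

```python
# Sustained rate needed to match the IRC experience quoted above.
size_kb = 200 * 1024        # 200 MB expressed in KB
seconds = 45 * 60           # 45 minutes in seconds
rate_kbps = size_kb / seconds
print(f"{rate_kbps:.0f} KB/s sustained")  # roughly 76 KB/s
```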

Seriously, I've found numerous posts on the Gnutella forums asking for guidelines on setting up the connection ratios, and not one single person got a direct and satisfactory answer. It's pretty clear to the users that no one cares enough to answer these questions, so they leave the defaults, or tweak them as they please.

To fix this mess, we really need more open slots than requests, or the network will never dig its way out. And unless the authoritative people address the situation in a clear and understandable way, Gnet will deteriorate into an unusable mess, with the exception of the 2-4 MB MP3 file with 10,000 active sources. Do we want Gnet to become this? I sure don't, but it's headed that way very quickly. And everyone (except the MP3 users, who don't really see the problem) pretends that nothing is wrong, and that it's the individual user's setup, or his poor connection to the net, or something else that is at fault.

We need to stop ignoring the problem, and do something about it. I leave my system on for hours to make my files available, but I think I am the exception. I have more than one computer, and can easily leave a system up to share anime. It's even legal content, except for the few episodes that have been released for US television, which luckily is almost none of them. But a lot of people want them: my systems are always completely filled with requests, and my queue can get pretty deep too.

If the 'industry' won't get together and take some action about this, the net will become more and more unusable for the average user. God help the poor fellow with a 56K modem. Unless he finds a source like my system, he's hosed if he wants anything other than a tiny file.

We can't do anything about people not sharing, or turning their systems off when they are finished downloading their own data. But we need to advise the people who are trying to make the net 'work' that they should offer more upload slots than downloads they run at any one time, or they will never see reasonable throughput or a reasonable count of active sources.

The situation is so bad that I'd guess we need three or four times as many upload slots as downloads, in order to offset the people who just won't share. If people throttle their upload bandwidth, fine, but make the slots available. If enough do that, downloaders will still see fine aggregate throughput, and no one will suffer as a result. It would be better still if everyone offered a large number of slots and as much bandwidth as they can afford to give. But getting the message out, so that the trend that is killing the network can start to turn around, is the responsibility of the 'industry' leaders. And they have ducked the issue for too long now.
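A back-of-the-envelope check of that three-to-four-times figure. The model is mine, for illustration only: assume every user runs one download at a time, only a fraction of users offer uploads at all, and slots balance when the sharers collectively offer as many upload slots as the whole network demands in download slots:

```python
# Toy balance model: sharers * upload_slots >= all_users * downloads,
# so each sharer must offer downloads / sharing_fraction upload slots.
# The uniform-demand assumption is mine, not from any measurement.
def required_upload_slots(downloads, sharing_fraction):
    """Upload slots each sharer must offer per `downloads` it runs,
    given that only `sharing_fraction` of users share at all."""
    return downloads / sharing_fraction

for sharing in (0.50, 0.33, 0.25):
    ratio = required_upload_slots(1, sharing)
    print(f"{sharing:.0%} sharing -> {ratio:.1f} upload slots per download")
```

Under these assumptions, if half of users share (the 50% guessed earlier), each sharer needs 2 upload slots per download; if only a quarter to a third share, the required ratio lands right in the 3-4x range.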

Sorry for the soapbox, but the irritation is rising to a level where I had to say something. It's clear others feel the same way. Will someone step up and make the effort to address this, or will we continue to ignore the problems?

If I had the equipment to measure the state of the net, I'd publish the results everywhere I could in order to get someone to address this. I'm sure some of the development groups have the equipment and knowledge to do that, but they are too busy to bother with what's really happening on our network. I think they have their priorities wrong, and that within another year Gnet will be useless if this trend continues. There needs to be an over-correction in the opposite direction, or we will never overcome the problems caused by careless or ignorant users.

--Rockkeys