New Feature Requests: Your idea for a cool new feature, or a LimeWire annoyance that has to get changed.
My opinion is that automatic propagation of files is going to be a huge thing in p2p networks. I would like to see LimeWire make its implementation of the Gnutella network intelligent enough to recognize the popularity of files, and make sure those files are available in proportion to their demand, without the intervention of the user. What I'm suggesting is very similar to Freenet in some regards.

I think it would be necessary, and very beneficial, for the client to claim an amount of the user's drive space and manage it for the user. Allowing the client to manage which files are being uploaded is a huge step toward managing uploads on the network in the most efficient manner. The files would be broken into chunks, organized by hash, and weighted by popularity. The files would also be accessible by hash, a feature I think is not available in Freenet. Due to the nature of the storage, plausible deniability would offer a further defense against legal action for using the network.

If the Gnutella network keeps track of upload/download ratios, the network could be intelligent enough to set up a hierarchical distribution network for files where demand is disproportionate to supply: taking an upload slot from someone who can only send out 12 KB/s, and putting someone with 24 KB/s of upload in their place, to serve the original downloader as well as another.

Here is how I imagine this being implemented. Say new Red Hat ISOs hit the network, and the original seed has only 50 KB/s of upload. That's hardly enough to serve the community; but the hash for the file is known, and it's so highly in demand that it's the most requested file on the network at that time. I imagine that users with huge amounts of bandwidth would automatically connect to the user with the Red Hat ISOs, automatically stop sharing the less popular files in their community share, and begin sharing out Red Hat.
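The "chunks, organized by hash, and weighted by popularity" idea could look something like the sketch below. This is a minimal illustration of the concept only; the class name, budget parameter, and eviction rule are my own invention, not anything LimeWire or Gnutella actually defines. The client claims a fixed disk budget, stores chunks keyed by their hash, counts requests as a popularity signal, and evicts the least-requested chunks when space runs out.

```python
import hashlib

class CommunityShare:
    """Hypothetical sketch of a client-managed 'community share':
    a fixed disk budget holding file chunks keyed by hash,
    weighted by popularity, evicting the least-requested first."""

    def __init__(self, budget_bytes):
        self.budget = budget_bytes
        self.used = 0
        self.chunks = {}    # hash -> chunk bytes
        self.requests = {}  # hash -> times this chunk was served

    def store(self, data):
        """Add a chunk; evict the least popular chunks to stay in budget."""
        h = hashlib.sha1(data).hexdigest()
        if h in self.chunks:
            return h
        while self.used + len(data) > self.budget and self.chunks:
            # Evict the chunk with the fewest recorded requests.
            coldest = min(self.chunks, key=lambda k: self.requests.get(k, 0))
            self.used -= len(self.chunks.pop(coldest))
            self.requests.pop(coldest, None)
        self.chunks[h] = data
        self.used += len(data)
        return h

    def fetch(self, h):
        """Serve a chunk by hash, counting the request toward popularity."""
        if h in self.chunks:
            self.requests[h] = self.requests.get(h, 0) + 1
            return self.chunks[h]
        return None
```

With this scheme, a chunk nobody asks for naturally gets pushed out when a hot file like a new ISO needs the space, which is exactly the "migrate popular files without user intervention" behavior described above.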
If the first round of helpers doesn't have the bandwidth to support the community, another level of upload relays could be added, and so on, until supply is proportionate to demand. Within ten minutes of posting the hash, a user with 50 KB/s of upload might sit at the top of an upload chain serving out 100 times that much data. Everyone could be getting a speedy download, and everybody would be happy.

Freenet is nearly impossible to publish to, and is exceptionally slow as well. BitTorrent is also more difficult to publish to than I have become used to. I expect to be able to drop a file into a shared directory and have it scanned and picked up by my file sharing client. This would allow popular files to migrate from anyone's shared directory into their community share effortlessly.

I posted a question about changing the way queues are handled in the "open discussion" forum. I asked about round-robinning out the data instead of making downloaders wait in line, which would probably be beneficial if implemented in a system such as I have described in this post. I think this could even be implemented on top of the Gnutella network without altering the protocol, and still communicate with plain vanilla Gnutella clients seamlessly, since the files would still be sent and received in the same way (though I don't know if this breaks strict adherence to the protocol).

People are bad at managing their uploads; the client software could do a much better job. The strengths of the Gnutella network, the popularity of the LimeWire client, and these ideas together could revolutionize file sharing.

-Sgt. Stedenko
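The round-robin queue idea mentioned in the post can be sketched concretely. This is an illustrative toy, not anything from the Gnutella protocol or LimeWire's code: instead of serving the queue head to completion while everyone else waits, the uploader sends one chunk to each waiting downloader per pass, so every requester makes steady progress.

```python
from collections import deque

def round_robin_upload(requests, chunk_size=4096):
    """Hypothetical sketch of round-robin uploading: `requests` maps a
    downloader name to the bytes it asked for, and the return value is
    the send order as (downloader, offset, length) tuples."""
    queue = deque(requests.items())
    offsets = {name: 0 for name in requests}
    schedule = []
    while queue:
        name, data = queue.popleft()
        start = offsets[name]
        piece = data[start:start + chunk_size]
        schedule.append((name, start, len(piece)))
        offsets[name] = start + len(piece)
        if offsets[name] < len(data):
            # More chunks remain for this downloader: back of the line.
            queue.append((name, data))
    return schedule
```

A downloader wanting one chunk finishes after one pass instead of waiting behind a multi-gigabyte transfer, while the big transfer still progresses on every rotation.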