Quote:
It is actually rather easy to fake a full file hash, just lie!
|
I did not doubt that, but do you believe there will be clients creating fake hashes? A malicious attacker could sabotage the Gnutella network far more easily than by serving files with fake hashes.
Quote:
[...]
Then, if there is a problem, it doesn't know if it was just a mistake in transferring data. If the file was multi-sourced, then there is no way to know which of the many clients it downloaded from lied.
|
It is possible to identify the clients that lied by letting the downloads overlap by a couple of bytes (that's even easier with HTTP/1.1 range requests); but I doubt we will have to worry about that. Who would do so? I think it's even illegal under the DMCA!
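To make the overlap idea concrete, here is a minimal sketch of how a client could flag disagreeing sources. The chunk representation (start offset, bytes, source name) is hypothetical and not any real client's API; the point is only that wherever two downloaded ranges overlap, the overlapping bytes must match, so a mismatch narrows the liar down to one of two sources:

```python
def find_conflicting_sources(chunks):
    """chunks: list of (start, data, source) tuples describing downloaded
    byte ranges. Returns pairs of sources whose overlapping bytes disagree."""
    conflicts = []
    for i in range(len(chunks)):
        for j in range(i + 1, len(chunks)):
            s1, d1, src1 = chunks[i]
            s2, d2, src2 = chunks[j]
            # Intersection of the two byte ranges.
            lo = max(s1, s2)
            hi = min(s1 + len(d1), s2 + len(d2))
            if lo < hi and d1[lo - s1:hi - s1] != d2[lo - s2:hi - s2]:
                conflicts.append((src1, src2))
    return conflicts
```

With a two-byte overlap an honest swarm produces no conflicts, while a source sending garbage shows up immediately in every pair it overlaps with; repeating the check against a third source then isolates the bad one.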
Quote:
This is a MAJOR vulnerability with the current gnutella network.
|
There are other vulnerabilities in the Gnutella network.
Quote:
One rogue client could search the net, find the size and hash of files, and then use the same file size and hash to respond to ALL queries it can, send garbage data as just a small part of a swarm, and destroy thousands or possibly millions of file transfers with minimal bandwidth usage.
|
But that's a rather unlikely scenario, and with download overlapping you could recover from it.
I say just give simple partial sharing a try, to see if tree hashes are really necessary. This simple kind of partial sharing could be ready in a month, without tedious discussions in the GDF. Your kind of partial sharing will need at least half a year to implement.