Gnutella Forums

Gnutella Forums (https://www.gnutellaforums.com/)
-   Download/Upload Problems (https://www.gnutellaforums.com/download-upload-problems/)
-   -   Here we go again: Return of the Incredibly Annoying Corrupted Download.dat (https://www.gnutellaforums.com/download-upload-problems/21722-here-we-go-again-return-incredibly-annoying-corrupted-download-dat.html)

marvin_arnold September 7th, 2003 03:36 PM

Here we go again: Return of the Incredibly Annoying Corrupted Download.dat
 
LW 3.5.2
Mac OSX 10.2.6

At least I thought they had finally solved the problem with disappearing downloads due to a corrupted downloads.dat file.
It seemed so easy:
The program does a backup (imagine!: on its own!! (sorry for being sarcastic)) of that ****ing file that causes so much grief and trouble,
and when a Big Bad Crash occurs, we still have a nice and tidy downloads.bak to rely upon.

It worked well, at least before I upgraded to 3.5.2. Since then, almost every time I close down (in the friendliest possible way, without forcing anybody to quit or pulling the power plug or shouting swear words at the screen), the prog seems to "forget" to write the .dat file as well as the .bak file.
I have tried to "resurrect" older .dat/.bak files with Norton UnErase, which worked twice but never again, and now I sit here with half a GByte of partial files and a Gnutella client with amnesia.

I tried renaming .bak files to .dat files, and offering only the .bak file, with no other result than a blank downloads window.
Is there any way to retrieve a .dat file from the .bak file? I gather there is no way to resume a partial download without the .dat file.

A possible solution: Wouldn't it be great if LW didn't write the .dat and .bak files at the same time? So that after a crash (which apparently happens every time I quit LW) we would have at least one of them?

Is there a light at the end of the tunnel or is it just an approaching train?
So many questions, so little time...

cheerio!
M. A.

sberlin September 7th, 2003 08:23 PM

Hi Marvin,

The logic of the dat/bak is thus:

Every 30 seconds or so, LimeWire attempts to write out a downloads.dat file with the newest information of what has just been downloaded. Just before writing the newest file out, LimeWire will rename the current downloads.dat to downloads.bak. If there are ANY problems with writing out the newest downloads.dat, the downloads.bak file is renamed back to downloads.dat.

When LimeWire is started, it attempts to read the downloads.dat file. If it does not exist or the read fails for some reason, it attempts to read the downloads.bak file. If the downloads.bak read succeeded, it is renamed to downloads.dat.
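The save/restore rotation described above can be sketched roughly as follows. This is a minimal Python sketch of the general scheme only, not LimeWire's actual Java code; the file names match the thread, but the pickle serialization and function names are stand-ins:

```python
import os
import pickle

DAT = "downloads.dat"
BAK = "downloads.bak"

def save_downloads(entries):
    """Periodic save: rotate the current .dat aside, then write a fresh one."""
    if os.path.exists(DAT):
        os.replace(DAT, BAK)          # downloads.dat -> downloads.bak
    try:
        with open(DAT, "wb") as f:
            pickle.dump(entries, f)
    except OSError:
        # Writing the new file failed: restore the previous snapshot.
        if os.path.exists(BAK):
            os.replace(BAK, DAT)
        raise

def load_downloads():
    """Startup read: prefer the .dat; fall back to the .bak and promote it."""
    try:
        with open(DAT, "rb") as f:
            return pickle.load(f)
    except (OSError, pickle.PickleError):
        with open(BAK, "rb") as f:
            entries = pickle.load(f)
        os.replace(BAK, DAT)          # the .bak read succeeded, so promote it
        return entries
```

Note that under this scheme the .bak is never written directly; it only ever holds the previous successfully written .dat, which is why offering a renamed .bak on its own can still work, but only if that older snapshot was itself intact.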

If you have suggestions on how this process can be improved, we'll be glad to hear them. The solution, however, is not to write out a separate downloads.bak file, because chances are that if writing downloads.dat fails, writing downloads.bak would fail too.

Thanks.

marvin_arnold September 8th, 2003 03:31 AM

thanks sberlin for replying so fast.

it seems to have something to do with the size of the downloads.dat, i.e. the amount of downloads/requests that LW is handling at the moment you quit. in the last days, i had tended to keep a lot of downloads/requests active, so the size of my last successfully saved .dat/.bak files was somewhere around 90 kB. Maybe, when shutting down, LW is unable to process such a large file?

unfortunately, i am not a developer. I can just guess.

Thanks again,

M: A:

marvin_arnold September 8th, 2003 01:09 PM

Size does matter! (Update)
 
It seems that when the downloads.dat file gets larger than about 90 kB, LW doesn't write it (or the .bak file) at all, no matter whether it's closing down or during normal operation.

90 kB means that there are about 90 entries (running downloads, requeries, awaiting sources, need more sources etc.) in the downloads window.

It looks like the number of "could not move to library" errors also increases with the number of items in the downloads window.

Any ideas anybody?

Have fun,
M: A:

sberlin September 8th, 2003 01:26 PM

Hi Marvin,

Thanks for the detailed information. We'll look into what could be causing a problem with the larger number of downloads. Most likely the problem is that 90 kB is simply too much information to gather and write out that quickly. I'll post back here if we find anything that could solve the problem.

