General Discussion: for anything which doesn't fit somewhere else (for Phex users)
| |||
Losing download candidates... Hi. When downloading multiple files from the same host, it very often happens that a few of them start to download while the rest lose their download candidate for this host. I then have to go to the search window and add the candidate (the same file from the same host) manually. When the first files complete, the ones I added manually start to download without any problem. If I have, for example, 10 files I could not find anywhere else, and the host has its maximum uploads from the same host set to 2, I have to re-add the same files up to 10 times, depending on download speed. With more files this could be much worse. What is happening? J.
| |||
Can you check or post the log screen content of a download where this happens? You can copy it by pressing CTRL-C and paste it with CTRL-V. It might be that the hosts are dropped because Phex can't connect to them directly, or because the PUSH route to the host has expired. Gregor
| |||
Same here on the latest version. It used to be that the options for each file would keep at least one candidate; now with 0.6.1 I have to search manually (and the files seem to be there). Then there is the bug that I need to add them one by one with 'Add download candidate' (selecting groups never worked), and each time it also switches screens (which is no problem on a machine as fast as mine, but inconvenient anyway).

At any rate, retries used to count up into the hundreds; now all candidates are lost after the first try! How can the new Phex be so sure that a candidate will not answer sooner or later?

I suggest, instead of just deleting the candidate references, devising a system that rates each candidate, retries the poor ones less often, and searches again automatically only when faith in all candidates is lost (only poor options left). This way a client could use far less search bandwidth and stay in lurk-and-retry mode, even waiting until a candidate that had gone offline comes back the next day, all without generating much extra search traffic! This would increase the overall efficiency of the network!

Anyhow, thanks for your time!
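The rate-and-back-off idea suggested above could look roughly like this. This is a hypothetical sketch only, not Phex code; the class and all names in it are invented for illustration:

```java
// Hypothetical sketch of the suggestion above: instead of dropping a
// candidate after one failed try, rate it down and retry it with an
// exponentially growing delay. None of these names exist in Phex.
public class RatedCandidate {
    private final String hostAddress;
    private int failureCount = 0;

    public RatedCandidate(String hostAddress) {
        this.hostAddress = hostAddress;
    }

    public void recordFailure() {
        failureCount++;
    }

    public void recordSuccess() {
        failureCount = 0;
    }

    // Delay before the next retry: 30s, then 60s, 120s, ... capped at 1h.
    public long nextRetryDelayMillis() {
        long base = 30_000L;
        long delay = base << Math.min(failureCount, 7); // cap the shift
        return Math.min(delay, 3_600_000L);
    }

    // Only when faith in a candidate is lost would a fresh search be
    // triggered, instead of re-querying the network after every failure.
    public boolean isPoor() {
        return failureCount >= 8;
    }

    public String getHostAddress() {
        return hostAddress;
    }
}
```

A scheduler could then sort candidates by `nextRetryDelayMillis()` and only issue a new query once every candidate reports `isPoor()`, which is the "lurk and retry" mode described in the post.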
| |||
OK, I verified it, it is truly a problem... I had found this series of scientific talks, two of them on the same host. Luckily the first one started immediately... The second, listing the same IP, tried (I guess), failed, and was without a candidate after less than 2 minutes, while the first was still downloading from the same IP. That sounds majorly wrong, doesn't it? Any recommendations as to which file of the source or XML I should look at to fix that? Cheers, togo
| |||
OK, I suspect it is in src/download/DownloadFile.java:

    /**
     * The HOST_BUSY status is handled like an error status currently
     */
    public static final int HOST_BUSY = 6;

But to be honest, I have a lot of other stuff prioritized... and it is code written by others; I don't know how long it would take me to see through it. So please, somebody who already 'gets it', make a fix.
| |||
OK guys... there might be a bug in it that I don't know about. But every removal of a download candidate is clearly logged in the little log screen. If you see any removal that you think is not right, please post me the screen. You can be sure that no host is removed if it returns a host-busy signal (503). Hosts will be removed if they return 404 or 410, which means the file is not shared (anymore). Hosts will also be removed if they can't be reached with a connection try and a PUSH. That is because we don't know whether the host is firewalled; if we kept it in the list, there would be a very big chance you would never reach it. There are also some other severe errors where we drop the host because Phex doesn't understand it very well. ;-)

OK, some optimizations could be done here or there. But the thing is, we have to rewrite the whole download engine for swarming anyway. So I suggest: if you find problems, send me the log so we can fix them quickly, or just live with the bugs and the not-so-good rating of download candidates, and we talk about it again after we have released swarming. Gregor
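The removal rules Gregor describes can be summed up in a small decision function. This is an illustrative sketch of those rules, not the actual Phex source; the class and method names are invented:

```java
// Illustrative sketch of the candidate-removal rules described above
// (not the actual Phex source). A 503 means "busy, keep and retry";
// 404/410 mean the file is no longer shared, so the host is dropped;
// a host that answers neither a direct connect nor a PUSH is dropped
// too, since it may be unreachable (firewalled) for good.
public final class CandidateRules {
    private CandidateRules() {}

    public static boolean shouldRemoveForStatus(int httpStatus) {
        switch (httpStatus) {
            case 503: // host busy: keep the candidate, retry later
                return false;
            case 404: // not found
            case 410: // gone: file is not shared anymore
                return true;
            default:  // other severe errors: drop the host
                return httpStatus >= 400;
        }
    }

    public static boolean shouldRemoveUnreachable(
            boolean connectFailed, boolean pushFailed) {
        // Only drop when both the direct connection and the PUSH fail.
        return connectFailed && pushFailed;
    }
}
```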
| |||
OK, I have been checking the log frequently and haven't found anything that indicates a reason for my suspicion; it seems that it is the hosts that did not answer that get removed... Could it be a question of the timeout, or that their line is too busy? I mentioned that I had found two individual files on the same host and one started right away, but the other lost that same host and, as a result, had to generate unnecessary search traffic if it wanted to get the file again...

What surely can be said is that the efficiency of Phex over the last few versions has gone down by a few hundred percent, and I suspect the network traffic is up by the same factor! It is wrong to drop a host from the list for not responding one to x times...

Also, I am next going to post this in the host-catcher thread: I would like to see a filter, because I suspect that my default port is blocked here at the university. So why waste so many attempts on hosts with blocked ports from my catcher when reconnecting? When I choose one of the random ports, it connects just fine; if I don't, it takes forever to reconnect! Many thanks, Greg and all!
| |||
What would be luxury: a menu to select for which type of bad response a download candidate should be 'forgotten', maybe even with a counter and the option to say "if error X occurs 1 to n times, then...". Object-oriented programming is so cool! I seriously intend to get familiar with the Phex objects too, right after I get my priority project onto that desired plateau! You people are the wildest!
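The configurable "forget after n errors of type X" idea might be sketched like this. Purely hypothetical; nothing here exists in Phex, and the error categories are made up for illustration:

```java
import java.util.EnumMap;
import java.util.Map;

// Hypothetical sketch of the suggestion above: a per-error-type
// threshold deciding when a candidate should be forgotten.
// None of these names exist in Phex.
public class RemovalPolicy {
    public enum ErrorType { TIMEOUT, PUSH_FAILED, FILE_NOT_FOUND, SERVER_ERROR }

    private final Map<ErrorType, Integer> maxErrors =
            new EnumMap<>(ErrorType.class);
    private final Map<ErrorType, Integer> seenErrors =
            new EnumMap<>(ErrorType.class);

    // e.g. forget after 5 timeouts, but after a single FILE_NOT_FOUND
    public void setThreshold(ErrorType type, int max) {
        maxErrors.put(type, max);
    }

    // Records one occurrence of the error and returns true once the
    // candidate has reached its configured limit for that error type.
    public boolean recordAndCheck(ErrorType type) {
        int seen = seenErrors.merge(type, 1, Integer::sum);
        return seen >= maxErrors.getOrDefault(type, Integer.MAX_VALUE);
    }
}
```

An options dialog could expose one counter per `ErrorType`, which is exactly the menu the post asks for.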
| |||
Quote:
But this whole thing does not work if either your host or host A decides in the meantime to drop its neighbor A or B. Then the route is lost and can only be re-established with a new query. And if the other servent is many hops away, your chance of reaching it gets smaller and smaller with every hop. In your case, maybe one of the PUSH routes was still valid, but it broke before the request for the second file could get through. Gregor
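The route-loss mechanism Gregor describes can be pictured with a tiny route table: a PUSH request can only travel back along the chain of neighbor connections that carried the original query hit, so when any link in that chain drops, the whole route is gone. A minimal sketch, with invented names (this is not the Phex routing code):

```java
import java.util.HashMap;
import java.util.Map;

// Illustrative sketch of why a PUSH route expires (not Phex source).
// Each servent GUID is mapped to the neighbor connection its query
// hit arrived on; closing that neighbor connection breaks every
// route that ran through it, and only a new query can rebuild them.
public class PushRouteTable {
    private final Map<String, String> routeByGuid = new HashMap<>();

    public void learnRoute(String serventGuid, String neighbor) {
        routeByGuid.put(serventGuid, neighbor);
    }

    // When a neighbor connection closes, every route through it breaks.
    public void neighborDropped(String neighbor) {
        routeByGuid.values().removeIf(n -> n.equals(neighbor));
    }

    public boolean hasRoute(String serventGuid) {
        return routeByGuid.containsKey(serventGuid);
    }
}
```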
| |||
Curious behaviour coincidentally noticed! Check this out; this was the situation: I had 2 candidates that stayed in my list for several retries. Then I manually added one (24.186.174.241:6346), and when it failed and got removed, the second (80.132.186.12:8080) disappeared at the same time, even though it wasn't even the current one (that was the first, now 213.139.138.213:5635, which continued to retry...). I saw that by coincidence while checking the log... Here is an excerpt:

===================
Connect http://213.139.138.213:5635/get/173/Final Fantasy DVD rip DivX.avi
Normal connection ok
Send download handshake:
GET /get/173/Final Fantasy DVD rip DivX.avi HTTP/1.0
User-Agent: PHEX 0.6.1 (release)
Range: bytes=279687640-
Remote host replies: HTTP 503 UPLOAD LIMIT REACHED
Remote host is busy.
Start download.
position to read=279687640
Download name=D:\Download\Final Fantasy.avi.dl

Connect http://80.132.186.12:8080/get/275/final-fantasy (engl) (divx).avi
Normal connection ok
Send download handshake:
GET /get/275/final-fantasy (engl) (divx).avi HTTP/1.0
User-Agent: PHEX 0.6.1 (release)
Range: bytes=279687640-
Remote host replies: HTTP/1.1 503 SERVER BUSY
Remote host is busy.
Start download.
position to read=279687640
Download name=D:\Download\Final Fantasy.avi.dl

Connect http://24.186.174.241:6346/get/7/Fin...p-Divx-TDF.avi
Normal connection failed. Operation timed out: connect
Try push request. 7:1BC9C5076962E89DFF495BC102E3AF00
Wait for connection from sharing host
Time out waiting for connection from sharing host
Error: java.io.IOException: Time out on requesting push transfer.
========================

Also, I am noticing now that it seems the administration people blocked the default port; I don't get anything on 6346 anymore! Could you set it up so all future Phex clients default to random ports? Thanks, T